Nov 22 09:13:43 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 22 09:13:43 crc restorecon[4688]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:43 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 
09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 09:13:44 crc 
restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 
09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 09:13:44 crc restorecon[4688]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 22 09:13:45 crc kubenswrapper[4846]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 09:13:45 crc kubenswrapper[4846]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 22 09:13:45 crc kubenswrapper[4846]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 09:13:45 crc kubenswrapper[4846]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 22 09:13:45 crc kubenswrapper[4846]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 22 09:13:45 crc kubenswrapper[4846]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.720979 4846 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729489 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729536 4846 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729546 4846 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729555 4846 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729564 4846 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729574 4846 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729582 4846 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729590 4846 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729599 4846 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729607 4846 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729621 4846 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729632 4846 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729641 4846 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729650 4846 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729658 4846 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729667 4846 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729675 4846 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729686 4846 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729712 4846 feature_gate.go:330] unrecognized feature gate: Example Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729722 4846 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729730 4846 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729739 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729747 4846 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729755 4846 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729763 4846 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729772 4846 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729780 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729788 4846 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729797 4846 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729805 4846 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729814 4846 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729822 4846 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729831 4846 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729839 4846 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729860 4846 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729881 4846 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729890 4846 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729899 4846 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729908 4846 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729916 4846 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729924 4846 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729933 4846 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729943 4846 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729952 4846 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 
09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729960 4846 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729971 4846 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729979 4846 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729988 4846 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.729996 4846 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730004 4846 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730011 4846 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730020 4846 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730028 4846 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730036 4846 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730071 4846 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730080 4846 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730089 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730098 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730108 4846 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730116 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730125 4846 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730133 4846 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730141 4846 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730150 4846 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730162 4846 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730173 4846 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730185 4846 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730195 4846 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730204 4846 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730215 4846 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.730239 4846 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732309 4846 flags.go:64] FLAG: --address="0.0.0.0" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732347 4846 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732378 4846 flags.go:64] FLAG: --anonymous-auth="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732391 4846 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732405 4846 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732416 4846 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732430 4846 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732442 4846 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732452 4846 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732463 4846 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732473 4846 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732484 4846 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732494 4846 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732505 4846 flags.go:64] FLAG: --cgroup-root="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732516 4846 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732526 4846 flags.go:64] FLAG: --client-ca-file="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732535 4846 flags.go:64] FLAG: --cloud-config="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732545 4846 flags.go:64] FLAG: --cloud-provider="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732555 4846 flags.go:64] FLAG: --cluster-dns="[]" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732572 4846 flags.go:64] FLAG: --cluster-domain="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732582 4846 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732592 4846 flags.go:64] FLAG: --config-dir="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732602 4846 
flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732613 4846 flags.go:64] FLAG: --container-log-max-files="5" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732625 4846 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732635 4846 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732645 4846 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732656 4846 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732667 4846 flags.go:64] FLAG: --contention-profiling="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732677 4846 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732686 4846 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732696 4846 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732707 4846 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732721 4846 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732732 4846 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732755 4846 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732766 4846 flags.go:64] FLAG: --enable-load-reader="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732776 4846 flags.go:64] FLAG: --enable-server="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732786 4846 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732804 4846 flags.go:64] FLAG: --event-burst="100" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732814 4846 flags.go:64] FLAG: --event-qps="50" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732824 4846 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732835 4846 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732844 4846 flags.go:64] FLAG: --eviction-hard="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732856 4846 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732866 4846 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732875 4846 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732886 4846 flags.go:64] FLAG: --eviction-soft="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732896 4846 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732906 4846 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732915 4846 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732925 4846 flags.go:64] FLAG: --experimental-mounter-path="" Nov 
22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732935 4846 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732944 4846 flags.go:64] FLAG: --fail-swap-on="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732954 4846 flags.go:64] FLAG: --feature-gates="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732966 4846 flags.go:64] FLAG: --file-check-frequency="20s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732976 4846 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732985 4846 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732995 4846 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733005 4846 flags.go:64] FLAG: --healthz-port="10248" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733016 4846 flags.go:64] FLAG: --help="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733025 4846 flags.go:64] FLAG: --hostname-override="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733035 4846 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733074 4846 flags.go:64] FLAG: --http-check-frequency="20s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733085 4846 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733095 4846 flags.go:64] FLAG: --image-credential-provider-config="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733104 4846 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733118 4846 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733128 4846 flags.go:64] FLAG: --image-service-endpoint="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733137 4846 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733147 4846 flags.go:64] FLAG: --kube-api-burst="100" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733170 4846 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733181 4846 flags.go:64] FLAG: --kube-api-qps="50" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733192 4846 flags.go:64] FLAG: --kube-reserved="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733202 4846 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733212 4846 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733222 4846 flags.go:64] FLAG: --kubelet-cgroups="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733232 4846 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733241 4846 flags.go:64] FLAG: --lock-file="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733251 4846 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733260 4846 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733271 4846 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733286 
4846 flags.go:64] FLAG: --log-json-split-stream="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733308 4846 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733318 4846 flags.go:64] FLAG: --log-text-split-stream="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733328 4846 flags.go:64] FLAG: --logging-format="text" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733338 4846 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733348 4846 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733358 4846 flags.go:64] FLAG: --manifest-url="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733367 4846 flags.go:64] FLAG: --manifest-url-header="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733379 4846 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733389 4846 flags.go:64] FLAG: --max-open-files="1000000" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733401 4846 flags.go:64] FLAG: --max-pods="110" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733411 4846 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733427 4846 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733436 4846 flags.go:64] FLAG: --memory-manager-policy="None" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733447 4846 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733458 4846 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733468 4846 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733478 4846 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733500 4846 flags.go:64] FLAG: --node-status-max-images="50" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733509 4846 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733519 4846 flags.go:64] FLAG: --oom-score-adj="-999" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733530 4846 flags.go:64] FLAG: --pod-cidr="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733539 4846 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733555 4846 flags.go:64] FLAG: --pod-manifest-path="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733565 4846 flags.go:64] FLAG: --pod-max-pids="-1" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733587 4846 flags.go:64] FLAG: --pods-per-core="0" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733597 4846 flags.go:64] FLAG: --port="10250" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733607 4846 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733617 4846 flags.go:64] FLAG: --provider-id="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 
09:13:45.733627 4846 flags.go:64] FLAG: --qos-reserved="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733636 4846 flags.go:64] FLAG: --read-only-port="10255" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733646 4846 flags.go:64] FLAG: --register-node="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733656 4846 flags.go:64] FLAG: --register-schedulable="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733665 4846 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733681 4846 flags.go:64] FLAG: --registry-burst="10" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733691 4846 flags.go:64] FLAG: --registry-qps="5" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733701 4846 flags.go:64] FLAG: --reserved-cpus="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733710 4846 flags.go:64] FLAG: --reserved-memory="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733722 4846 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733743 4846 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733754 4846 flags.go:64] FLAG: --rotate-certificates="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733764 4846 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733774 4846 flags.go:64] FLAG: --runonce="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733784 4846 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733794 4846 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733804 4846 flags.go:64] FLAG: --seccomp-default="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733814 4846 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733824 4846 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733834 4846 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733845 4846 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733856 4846 flags.go:64] FLAG: --storage-driver-password="root" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733866 4846 flags.go:64] FLAG: --storage-driver-secure="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733875 4846 flags.go:64] FLAG: --storage-driver-table="stats" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733885 4846 flags.go:64] FLAG: --storage-driver-user="root" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733895 4846 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733905 4846 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733915 4846 flags.go:64] FLAG: --system-cgroups="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733924 4846 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733939 4846 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 22 09:13:45 crc kubenswrapper[4846]: 
I1122 09:13:45.733949 4846 flags.go:64] FLAG: --tls-cert-file="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733958 4846 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734006 4846 flags.go:64] FLAG: --tls-min-version="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734016 4846 flags.go:64] FLAG: --tls-private-key-file="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734025 4846 flags.go:64] FLAG: --topology-manager-policy="none" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734035 4846 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734074 4846 flags.go:64] FLAG: --topology-manager-scope="container" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734085 4846 flags.go:64] FLAG: --v="2" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734098 4846 flags.go:64] FLAG: --version="false" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734110 4846 flags.go:64] FLAG: --vmodule="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734121 4846 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734131 4846 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734435 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734447 4846 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734457 4846 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734468 4846 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734477 4846 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734487 4846 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734495 4846 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734504 4846 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734512 4846 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734521 4846 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734530 4846 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734541 4846 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734552 4846 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
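Each 'flags.go:64] FLAG: --name="value"' entry in the block above records one resolved command-line flag, with the value quoted Go-style. A sketch that folds such lines into a flag-to-value map (quoting convention assumed from the dump itself):

# Sketch: fold the kubelet's flags.go:64 dump into a {flag: value} map.
import re

FLAG_RE = re.compile(r'FLAG: --([\w-]+)="(.*)"')

sample = '''\
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.732592 4846 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.733478 4846 flags.go:64] FLAG: --node-ip="192.168.126.11"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.734085 4846 flags.go:64] FLAG: --v="2"
'''

flags = dict(FLAG_RE.findall(sample))
print(flags["config"])   # /etc/kubernetes/kubelet.conf
print(flags["node-ip"])  # 192.168.126.11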
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734562 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734571 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734580 4846 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734589 4846 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734597 4846 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734606 4846 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734614 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734622 4846 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734631 4846 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734639 4846 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734647 4846 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734658 4846 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734669 4846 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734692 4846 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734702 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734711 4846 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734719 4846 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734727 4846 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734736 4846 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734745 4846 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734753 4846 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734761 4846 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734769 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734778 4846 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734786 4846 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 09:13:45 crc kubenswrapper[4846]: 
W1122 09:13:45.734795 4846 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734805 4846 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734813 4846 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734822 4846 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734831 4846 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734840 4846 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734848 4846 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734857 4846 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734865 4846 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734873 4846 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734882 4846 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734890 4846 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734898 4846 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734906 4846 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734915 4846 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734923 4846 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734931 4846 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734940 4846 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734949 4846 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734957 4846 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734966 4846 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734975 4846 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734983 4846 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.734991 4846 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.735012 4846 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.735024 4846 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.735036 4846 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.735069 4846 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.735080 4846 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.735089 4846 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.735098 4846 feature_gate.go:330] unrecognized feature gate: Example Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.735107 4846 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.735116 4846 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.735143 4846 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.751912 4846 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.751969 4846 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752255 4846 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752272 4846 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752278 4846 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752284 4846 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752326 4846 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752334 4846 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752341 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752353 4846 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752359 4846 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752666 4846 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752679 4846 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752684 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 
09:13:45.752690 4846 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752696 4846 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752701 4846 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752711 4846 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752720 4846 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752729 4846 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752735 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752741 4846 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752747 4846 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752752 4846 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752759 4846 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752764 4846 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752769 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752776 4846 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752781 4846 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752787 4846 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752793 4846 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752799 4846 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752806 4846 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752811 4846 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752816 4846 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752823 4846 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752831 4846 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752837 4846 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752844 4846 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752849 4846 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752857 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752863 4846 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752868 4846 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752874 4846 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752879 4846 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752905 4846 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752911 4846 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752916 4846 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752921 4846 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752927 4846 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752932 4846 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752938 4846 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752944 4846 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752951 4846 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752958 4846 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752963 4846 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752968 4846 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752973 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752978 4846 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752982 4846 feature_gate.go:330] unrecognized feature gate: Example Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752987 4846 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752992 4846 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.752997 4846 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753002 4846 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753007 4846 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753013 4846 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753019 4846 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753025 4846 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753031 4846 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753036 4846 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753059 4846 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753065 4846 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753072 4846 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
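Three warning shapes recur through these feature-gate passes: feature_gate.go:330 for names absent from the kubelet's registry, :351 for deprecated gates (KMSv1), and :353 for GA gates that are still settable (ValidatingAdmissionPolicy, CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders). A toy model of that dispatch follows; the registry contents are illustrative assumptions, and only the lifecycle labels for the gates named in the log are grounded:

# Sketch: the three warning shapes seen above, keyed by gate lifecycle.
# Registry contents are assumptions for illustration, not the kubelet's
# real table; NodeSwap's stage in particular is a guess.
REGISTRY = {
    "KMSv1": "deprecated",                          # per the :351 lines
    "ValidatingAdmissionPolicy": "ga",              # per the :353 lines
    "CloudDualStackNodeIPs": "ga",                  # per the :353 lines
    "DisableKubeletCloudCredentialProviders": "ga", # per the :353 lines
    "NodeSwap": "beta",                             # assumption
}

def apply_gate(name: str, value: bool) -> str:
    stage = REGISTRY.get(name)
    v = str(value).lower()
    if stage is None:
        return f"W] unrecognized feature gate: {name}"
    if stage in ("deprecated", "ga"):
        label = "deprecated" if stage == "deprecated" else "GA"
        return (f"W] Setting {label} feature gate {name}={v}. "
                "It will be removed in a future release.")
    return f"I] {name}={v}"

for gate in ("GatewayAPI", "KMSv1", "ValidatingAdmissionPolicy", "NodeSwap"):
    print(apply_gate(gate, True))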
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.753082 4846 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753250 4846 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753260 4846 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753265 4846 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753270 4846 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753276 4846 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753282 4846 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753287 4846 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753292 4846 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753297 4846 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753302 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753308 4846 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753313 4846 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753318 4846 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753323 4846 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753328 4846 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753333 4846 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753338 4846 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753343 4846 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753347 4846 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753352 4846 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753357 4846 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753362 4846 feature_gate.go:330] unrecognized 
feature gate: GatewayAPI Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753367 4846 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753372 4846 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753377 4846 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753382 4846 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753387 4846 feature_gate.go:330] unrecognized feature gate: Example Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753391 4846 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753396 4846 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753401 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753405 4846 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753411 4846 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753416 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753420 4846 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753425 4846 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753430 4846 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753435 4846 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753440 4846 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753444 4846 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753451 4846 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753457 4846 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753463 4846 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753469 4846 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753476 4846 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753484 4846 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753491 4846 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753497 4846 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753503 4846 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753508 4846 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753513 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753518 4846 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753523 4846 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753528 4846 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753533 4846 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753538 4846 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753543 4846 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753549 4846 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753556 4846 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753560 4846 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753565 4846 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753570 4846 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753575 4846 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753580 4846 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753584 4846 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753590 4846 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753596 4846 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
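Each pass ends with an I-level feature_gate.go:386 line giving the effective result (logged at 09:13:45.735143, 09:13:45.753082, and again just below): only gates the kubelet recognizes survive, with the warned deprecated/GA gates forced on. A sketch that turns that Go-formatted map dump into a Python dict:

# Sketch: turn the Go-formatted "feature gates: {map[...]}" summary
# into a dict of booleans (abbreviated to a few gates from the log).
import re

line = ("feature gates: {map[CloudDualStackNodeIPs:true "
        "DisableKubeletCloudCredentialProviders:true KMSv1:true "
        "NodeSwap:false ValidatingAdmissionPolicy:true "
        "VolumeAttributesClass:false]}")

gates = {name: value == "true"
         for name, value in re.findall(r"(\w+):(true|false)", line)}

enabled = sorted(g for g, on in gates.items() if on)
print(enabled)
# ['CloudDualStackNodeIPs', 'DisableKubeletCloudCredentialProviders',
#  'KMSv1', 'ValidatingAdmissionPolicy']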
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753601 4846 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753607 4846 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753613 4846 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753619 4846 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.753626 4846 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.753635 4846 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.753866 4846 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.758549 4846 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.758657 4846 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.762443 4846 server.go:997] "Starting client certificate rotation"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.762488 4846 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.763564 4846 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-07 22:05:08.527960941 +0000 UTC
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.763736 4846 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 372h51m22.764228659s for next certificate rotation
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.806206 4846 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.810376 4846 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.835966 4846 log.go:25] "Validated CRI v1 runtime API"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.895534 4846 log.go:25] "Validated CRI v1 image API"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.897455 4846 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.906818 4846 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-22-09-08-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.906863 4846 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.933307 4846 manager.go:217] Machine: {Timestamp:2025-11-22 09:13:45.930388476 +0000 UTC m=+0.866078205 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:fa21c007-e82e-49e1-be5e-f6cba7f9397a BootID:ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a1:0c:53 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a1:0c:53 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:14:a8:4e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2e:41:8d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:cf:ac:2d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:78:e9:87 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9a:ed:7e:20:50:d5 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1e:d1:9e:10:aa:a3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.933737 4846 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
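The Machine entry above is cAdvisor's hardware inventory: 12 vCPUs each reported as its own single-core socket (typical of a QEMU/KVM guest such as this CRC VM), 33654128640 bytes of RAM, and zero pre-reserved hugepages in both pools. A small sketch of the arithmetic on those transcribed values; the helper layout is an assumption, the constants are from the log:

```go
package main

import "fmt"

func main() {
	// Values transcribed from the manager.go:217 Machine entry above.
	const memoryCapacity = 33654128640 // bytes
	const numCores = 12                // one thread per reported socket

	gib := float64(memoryCapacity) / (1 << 30)
	fmt.Printf("memory: %.1f GiB across %d vCPUs\n", gib, numCores) // ~31.3 GiB

	// HugePages from the same entry: PageSize is in KiB (1048576 KiB = 1 GiB
	// pages, 2048 KiB = 2 MiB pages), and both pools report NumPages:0,
	// so no memory is set aside for hugepages on this node.
	hugePages := map[int]int{1048576: 0, 2048: 0} // pageSizeKiB -> numPages
	totalKiB := 0
	for sizeKiB, n := range hugePages {
		totalKiB += sizeKiB * n
	}
	fmt.Printf("hugepages reserved: %d KiB\n", totalKiB)
}
```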
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.933974 4846 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.935038 4846 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.935346 4846 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.935401 4846 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.935743 4846 topology_manager.go:138] "Creating topology manager with none policy"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.935762 4846 container_manager_linux.go:303] "Creating device plugin manager"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.936408 4846 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.936491 4846 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.937770 4846 state_mem.go:36] "Initialized new in-memory state store"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.938419 4846 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.946462 4846 kubelet.go:418] "Attempting to sync node with API server"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.946719 4846 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
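The nodeConfig above carries five hard eviction thresholds, each either an absolute quantity (memory.available < 100Mi) or a percentage of capacity (nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%). As a hedged model of how such a threshold is evaluated (not kubelet's eviction_manager types), with the two thresholds taken from the log and the observed values invented for illustration:

```go
package main

import "fmt"

// Threshold mirrors a HardEvictionThresholds entry from the nodeConfig
// above: an absolute byte quantity, or a fraction of total capacity.
type Threshold struct {
	Signal     string
	Quantity   int64   // absolute bytes; 0 if percentage-based
	Percentage float64 // fraction of capacity; 0 if quantity-based
}

// breached reports whether the observed available amount has fallen
// below the threshold's effective limit.
func breached(t Threshold, available, capacity int64) bool {
	limit := t.Quantity
	if t.Percentage > 0 {
		limit = int64(t.Percentage * float64(capacity))
	}
	return available < limit
}

func main() {
	// memory.available < 100Mi, the log's first threshold.
	mem := Threshold{Signal: "memory.available", Quantity: 100 << 20}
	// nodefs.available < 10%, the log's second threshold.
	nodefs := Threshold{Signal: "nodefs.available", Percentage: 0.10}

	// Hypothetical observations: 512 MiB free RAM on the node's
	// 33654128640-byte total; 5 GiB free on an 80 GiB nodefs.
	fmt.Println(mem.Signal, "breached:", breached(mem, 512<<20, 33654128640)) // false
	fmt.Println(nodefs.Signal, "breached:", breached(nodefs, 5<<30, 80<<30))  // true: 5 GiB < 8 GiB
}
```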
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.946758 4846 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.946786 4846 kubelet.go:324] "Adding apiserver pod source"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.946803 4846 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.960640 4846 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.961925 4846 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.963799 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused
Nov 22 09:13:45 crc kubenswrapper[4846]: E1122 09:13:45.963937 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError"
Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.964072 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused
Nov 22 09:13:45 crc kubenswrapper[4846]: E1122 09:13:45.964188 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.966077 4846 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968139 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968230 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968325 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968399 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968473 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968539 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968603 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968668 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968730 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968800 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968866 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.968926 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.970490 4846 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.971238 4846 server.go:1280] "Started kubelet"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.971376 4846 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.971548 4846 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.972478 4846 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.972497 4846 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.973649 4846 server.go:460] "Adding debug handlers to kubelet server"
Nov 22 09:13:45 crc systemd[1]: Started Kubernetes Kubelet.
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.974194 4846 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.974248 4846 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.974383 4846 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 05:50:26.180023624 +0000 UTC Nov 22 09:13:45 crc kubenswrapper[4846]: E1122 09:13:45.974479 4846 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.974503 4846 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 380h36m40.205527136s for next certificate rotation Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.975306 4846 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.975432 4846 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.975370 4846 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 22 09:13:45 crc kubenswrapper[4846]: W1122 09:13:45.976362 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused Nov 22 09:13:45 crc kubenswrapper[4846]: E1122 09:13:45.976494 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError" Nov 22 09:13:45 crc kubenswrapper[4846]: E1122 09:13:45.976888 4846 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.107:6443: connect: connection refused" interval="200ms" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.979378 4846 factory.go:153] Registering CRI-O factory Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.979411 4846 factory.go:221] Registration of the crio container factory successfully Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.979490 4846 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.979509 4846 factory.go:55] Registering systemd factory Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.979519 4846 factory.go:221] Registration of the systemd container factory successfully Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.979557 4846 factory.go:103] Registering Raw factory Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.979579 4846 manager.go:1196] Started watching for new ooms in manager Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.988532 4846 manager.go:319] Starting recovery of all containers Nov 22 09:13:45 crc kubenswrapper[4846]: 
I1122 09:13:45.994304 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994381 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994400 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994416 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994432 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994446 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994460 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994474 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994496 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994514 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994527 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994543 4846 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994558 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994576 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994589 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994604 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994619 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994634 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994651 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994667 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994679 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994695 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994713 4846 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994728 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994742 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994758 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994777 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994793 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994808 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994837 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994850 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994864 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994901 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994915 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994958 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994975 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.994990 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995071 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995093 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995107 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995121 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995134 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995208 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995226 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995244 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995258 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995272 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995288 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995302 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995316 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995330 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995345 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995366 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995383 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995403 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995418 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995438 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995454 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995471 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995487 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995504 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995520 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995533 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995546 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995562 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995579 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995593 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995606 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995645 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: E1122 09:13:45.993072 4846 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.107:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a49533fd7e538 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 09:13:45.97119724 +0000 UTC m=+0.906886899,LastTimestamp:2025-11-22 09:13:45.97119724 +0000 UTC m=+0.906886899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995662 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995852 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995918 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995957 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.995991 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996023 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" 
seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996115 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996149 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996181 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996211 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996243 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996276 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996310 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996345 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996380 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996413 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996460 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 22 09:13:45 
crc kubenswrapper[4846]: I1122 09:13:45.996495 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996529 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996562 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996592 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996625 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996656 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996688 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996723 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996753 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996788 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996820 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 22 09:13:45 crc 
kubenswrapper[4846]: I1122 09:13:45.996852 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996891 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996933 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.996965 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997001 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997033 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997122 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997174 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997211 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997250 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997284 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 22 
09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997321 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997359 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997392 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997425 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997460 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997492 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997527 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997561 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997592 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997626 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997743 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997773 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997806 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997838 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997879 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997907 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997939 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997968 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.997996 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998025 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998089 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998119 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998148 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998182 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998206 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998232 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998267 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998298 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Nov 22 09:13:45 crc kubenswrapper[4846]: I1122 09:13:45.998327 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002265 4846 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002347 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002372 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002391 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002417 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002483 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002496 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002511 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002532 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002547 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002564 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002583 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002601 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002617 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002630 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 
09:13:46.002646 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002659 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002676 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002692 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002705 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002720 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002734 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002748 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002769 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002789 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002803 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 
09:13:46.002821 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002836 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002863 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002878 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002891 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002905 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002919 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002933 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002946 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002961 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002975 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.002993 4846 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003006 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003019 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003032 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003091 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003106 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003121 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003135 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003150 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003163 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003176 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003191 4846 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003203 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003216 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003229 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003249 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003263 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003277 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003290 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003307 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003322 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003335 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003349 4846 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003368 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003388 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003407 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003422 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003451 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003469 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003486 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003504 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003520 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003535 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003549 4846 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003562 4846 reconstruct.go:97] "Volume reconstruction finished" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.003571 4846 reconciler.go:26] "Reconciler: start to sync state" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.012726 4846 manager.go:324] Recovery completed Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.028379 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.030950 4846 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.032180 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.032254 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.032275 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.033791 4846 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.033856 4846 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.033891 4846 kubelet.go:2335] "Starting kubelet main sync loop" Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.033949 4846 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 22 09:13:46 crc kubenswrapper[4846]: W1122 09:13:46.034685 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.034742 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.035339 4846 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.035360 4846 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.035386 4846 state_mem.go:36] "Initialized new in-memory state store" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.064427 4846 policy_none.go:49] "None policy: Start" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.065683 4846 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.065719 4846 state_mem.go:35] "Initializing new in-memory state store" Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.075035 4846 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.135095 4846 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.146690 4846 manager.go:334] "Starting Device Plugin manager" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.146812 4846 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.146830 4846 server.go:79] "Starting device plugin registration server" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.147525 4846 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.147549 4846 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.147939 4846 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.148070 4846 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.148088 4846 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.157689 4846 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.177819 4846 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.107:6443: connect: connection refused" interval="400ms" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.249446 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.252171 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.252224 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.252235 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.252266 4846 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.252914 4846 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.107:6443: connect: connection refused" node="crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.335879 4846 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.336078 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.337999 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.338084 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.338102 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.338290 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.338574 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.338647 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.339435 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.339480 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.339493 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.339681 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.339873 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.339935 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.339949 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.340075 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.340163 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.340872 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.340917 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.340929 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.341159 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.341350 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.341399 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.341806 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.341932 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342023 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342079 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342124 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342133 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342224 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342369 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342407 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342411 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342450 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.342463 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.343196 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.343244 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.343259 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.344061 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.344101 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.344115 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.344297 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.344334 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.344926 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.344947 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.344956 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.406928 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.406977 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407002 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407021 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407113 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407153 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407195 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407215 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407254 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407279 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407305 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407351 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407373 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407395 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.407415 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.453148 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.454419 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.454480 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
09:13:46.454498 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.454536 4846 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.454999 4846 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.107:6443: connect: connection refused" node="crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.508977 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509025 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509064 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509083 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509102 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509120 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509130 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509152 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509148 4846 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509181 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509332 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509352 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509366 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509384 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509402 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509421 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509257 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509452 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 
09:13:46.509437 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509460 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509518 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509446 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509484 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509500 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509240 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509248 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509506 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509525 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 09:13:46 
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509254 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.509692 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.532309 4846 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.107:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a49533fd7e538 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 09:13:45.97119724 +0000 UTC m=+0.906886899,LastTimestamp:2025-11-22 09:13:45.97119724 +0000 UTC m=+0.906886899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.578644 4846 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.107:6443: connect: connection refused" interval="800ms"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.669262 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.696208 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.714309 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 09:13:46 crc kubenswrapper[4846]: W1122 09:13:46.714665 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-94f9eae051c670c0a48666c834ef02b61c4036af66bb2be5b68a26c128cb2216 WatchSource:0}: Error finding container 94f9eae051c670c0a48666c834ef02b61c4036af66bb2be5b68a26c128cb2216: Status 404 returned error can't find the container with id 94f9eae051c670c0a48666c834ef02b61c4036af66bb2be5b68a26c128cb2216
Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.719865 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 09:13:46 crc kubenswrapper[4846]: W1122 09:13:46.736623 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0dd784419f795498f02a2756ec9397c621c6fa745da65caef1b1d0bfa5e9ce9a WatchSource:0}: Error finding container 0dd784419f795498f02a2756ec9397c621c6fa745da65caef1b1d0bfa5e9ce9a: Status 404 returned error can't find the container with id 0dd784419f795498f02a2756ec9397c621c6fa745da65caef1b1d0bfa5e9ce9a Nov 22 09:13:46 crc kubenswrapper[4846]: W1122 09:13:46.738381 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2365c0dfd0457ddc75a53669ad7eb5e10f9b6e59b90c85bafc9c303827e6eadb WatchSource:0}: Error finding container 2365c0dfd0457ddc75a53669ad7eb5e10f9b6e59b90c85bafc9c303827e6eadb: Status 404 returned error can't find the container with id 2365c0dfd0457ddc75a53669ad7eb5e10f9b6e59b90c85bafc9c303827e6eadb Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.855854 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.857389 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.857420 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.857428 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.857458 4846 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 09:13:46 crc kubenswrapper[4846]: E1122 09:13:46.858012 4846 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.107:6443: connect: connection refused" node="crc" Nov 22 09:13:46 crc kubenswrapper[4846]: I1122 09:13:46.973301 4846 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused Nov 22 09:13:47 crc kubenswrapper[4846]: W1122 09:13:47.002640 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused Nov 22 09:13:47 crc kubenswrapper[4846]: E1122 09:13:47.002772 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError" Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.042737 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"94f9eae051c670c0a48666c834ef02b61c4036af66bb2be5b68a26c128cb2216"} Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.043754 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"68a9ac7a23a688010b4f03f3f7f7487f98a8df81b69dc0c102f3466a8b7b3f51"} Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.045631 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2365c0dfd0457ddc75a53669ad7eb5e10f9b6e59b90c85bafc9c303827e6eadb"} Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.046766 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0dd784419f795498f02a2756ec9397c621c6fa745da65caef1b1d0bfa5e9ce9a"} Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.047810 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8d0f4ecf27ad02e4c3a2e2a599eb1bfc3499b29ae31a966aa5930d5ff47f9981"} Nov 22 09:13:47 crc kubenswrapper[4846]: W1122 09:13:47.338099 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused Nov 22 09:13:47 crc kubenswrapper[4846]: E1122 09:13:47.338567 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError" Nov 22 09:13:47 crc kubenswrapper[4846]: E1122 09:13:47.380126 4846 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.107:6443: connect: connection refused" interval="1.6s" Nov 22 09:13:47 crc kubenswrapper[4846]: W1122 09:13:47.522467 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused Nov 22 09:13:47 crc kubenswrapper[4846]: E1122 09:13:47.522545 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError" Nov 22 09:13:47 crc kubenswrapper[4846]: W1122 09:13:47.550684 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused Nov 
Nov 22 09:13:47 crc kubenswrapper[4846]: E1122 09:13:47.550820 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError"
Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.658479 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.660141 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.660187 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.660196 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.660224 4846 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 22 09:13:47 crc kubenswrapper[4846]: E1122 09:13:47.660775 4846 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.107:6443: connect: connection refused" node="crc"
Nov 22 09:13:47 crc kubenswrapper[4846]: I1122 09:13:47.974450 4846 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.054280 4846 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4" exitCode=0
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.054381 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4"}
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.054491 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.056947 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.056996 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.057009 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.057322 4846 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9807ac6c274e3298628752bd2f94ee3115f49e18bbaac261b01ccac34d899589" exitCode=0
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9807ac6c274e3298628752bd2f94ee3115f49e18bbaac261b01ccac34d899589"} Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.057494 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.058872 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.059446 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.059499 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.059519 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.059774 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.059815 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.059826 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.059917 4846 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="96eb08f46c24a133d77a59e309cc57196d612d2b5138f3af5672f202a6d6fe61" exitCode=0 Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.059989 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"96eb08f46c24a133d77a59e309cc57196d612d2b5138f3af5672f202a6d6fe61"} Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.060021 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.061733 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.061777 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.061791 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.064483 4846 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb" exitCode=0 Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.064803 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb"} Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.064931 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.067202 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.067240 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.067252 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.069832 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748"}
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.069908 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a"}
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.069926 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5"}
Nov 22 09:13:48 crc kubenswrapper[4846]: I1122 09:13:48.973825 4846 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused
Nov 22 09:13:48 crc kubenswrapper[4846]: E1122 09:13:48.981814 4846 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.107:6443: connect: connection refused" interval="3.2s"
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.076101 4846 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="25bea4a5838fbaab7ee149b0c094eee1b75d2df12560b370f9b75b61f228e2f1" exitCode=0
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.076189 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"25bea4a5838fbaab7ee149b0c094eee1b75d2df12560b370f9b75b61f228e2f1"}
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.076403 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.077579 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.077684 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.077751 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"998ada769df7495241dc7b742cbab76b5bc62cf61517ff041377d23245269499"} Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.078147 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.079220 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.079256 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.079269 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.080429 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e7342fe923d5b1956317e71de0a3772d27d9597f42fb3a25c1e7ed3dc2359e1a"} Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.080536 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.080545 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"40455af6104d8aa70c20ac13f028d0c69e35e75e7c8f4e2fca52ba27ba956a9c"} Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.080774 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d2421789414be0566b9e054990def55fa91dc75dd3e1244d4a7dca86c0aafc17"} Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.082345 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.082374 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.082385 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.086152 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.086182 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7"} Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.089345 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.089396 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.089410 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.091292 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b"}
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.091348 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36"}
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.091383 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438"}
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.091396 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7"}
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.261582 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.263758 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.263846 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.263871 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.263937 4846 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 22 09:13:49 crc kubenswrapper[4846]: E1122 09:13:49.264693 4846 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.107:6443: connect: connection refused" node="crc"
Nov 22 09:13:49 crc kubenswrapper[4846]: W1122 09:13:49.684766 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused
Nov 22 09:13:49 crc kubenswrapper[4846]: E1122 09:13:49.684844 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError"
Nov 22 09:13:49 crc kubenswrapper[4846]: W1122 09:13:49.745554 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused
Nov 22 09:13:49 crc kubenswrapper[4846]: E1122 09:13:49.745627 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.107:6443: connect: connection refused" logger="UnhandledError"
Nov 22 09:13:49 crc kubenswrapper[4846]: I1122 09:13:49.973587 4846 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.107:6443: connect: connection refused
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.095816 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.098677 4846 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9" exitCode=255
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.098741 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9"}
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.098799 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.099896 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.099952 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.099962 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.100697 4846 scope.go:117] "RemoveContainer" containerID="0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9"
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.101927 4846 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9611dcbc5d73d0e37ff3e99b39e84a783a6b01aa2544711bc46ba5519cb35e01" exitCode=0
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.102030 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.102015 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9611dcbc5d73d0e37ff3e99b39e84a783a6b01aa2544711bc46ba5519cb35e01"}
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.102089 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.102113 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.102259 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
annotation to enable volume controller attach/detach" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.103376 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.103402 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.103413 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.103469 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.103507 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.103526 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.103593 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.103614 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.103626 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.104481 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.104520 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:50 crc kubenswrapper[4846]: I1122 09:13:50.104532 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.105965 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.107223 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4"} Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.107390 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.107720 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.108531 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.108565 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.108583 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.113333 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.113446 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.113728 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"97534396f697eeceaa32da9cb02f110cc9a3bbe90f98b8ccfac13d2d692fd13a"}
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.113761 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ba5b11f0d56e4061408321dc3a81fa978fe16802d273f013ed469ddd30fd59c9"}
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.113789 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bc9369cc8aeb5063c77b2903f11b072e13ad00b65213695b24aece2781447c6e"}
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.113801 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6ce3a473d395602dc4a069e17443c0dd35f1a07bf91449ffd385d2c33d15f42d"}
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.113811 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"460f9a7df0be432a0284ae062957ae60158969f4b6cb7e8ae907d10ddd488a00"}
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.114195 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.114218 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.114230 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.114953 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.114983 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.114996 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.144784 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.327593 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.328075 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.329960 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
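The "SyncLoop (PLEG)" bursts above are the kubelet's pod lifecycle event generator relaying container state changes from the runtime (CRI-O here, per the crio- cgroup names) back into the sync loop: each static pod's init containers finish with exitCode=0 and are followed by ContainerStarted events for the long-running containers. A rough per-pod tally can be pulled from a saved copy of the journal; this is an illustrative sketch keyed to the quoted klog fields, with kubelet.log an assumed filename:

    import re
    from collections import Counter, defaultdict

    # Group PLEG events by pod and event type (ContainerStarted / ContainerDied).
    pleg = re.compile(r'pod="(?P<pod>[^"]+)" event=\{"ID":"[^"]+","Type":"(?P<type>[^"]+)"')

    tally = defaultdict(Counter)
    with open("kubelet.log") as f:  # assumed local copy of this journal
        for line in f:
            if "SyncLoop (PLEG)" not in line:
                continue
            m = pleg.search(line)
            if m:
                tally[m.group("pod")][m.group("type")] += 1

    for pod, events in sorted(tally.items()):
        print(pod, dict(events))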
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.330022 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:51 crc kubenswrapper[4846]: I1122 09:13:51.330076 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.116301 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.116369 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.116367 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.117710 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.117755 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.117766 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.117716 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.117801 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.117814 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.465409 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.466635 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.466672 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.466683 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.466709 4846 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 22 09:13:52 crc kubenswrapper[4846]: I1122 09:13:52.774133 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 09:13:53 crc kubenswrapper[4846]: I1122 09:13:53.119211 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:53 crc kubenswrapper[4846]: I1122 09:13:53.119211 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:53 crc kubenswrapper[4846]: I1122 09:13:53.120137 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
event="NodeHasNoDiskPressure" Nov 22 09:13:53 crc kubenswrapper[4846]: I1122 09:13:53.120184 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:53 crc kubenswrapper[4846]: I1122 09:13:53.120337 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:53 crc kubenswrapper[4846]: I1122 09:13:53.120371 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:53 crc kubenswrapper[4846]: I1122 09:13:53.120383 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:53 crc kubenswrapper[4846]: I1122 09:13:53.839266 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:13:54 crc kubenswrapper[4846]: I1122 09:13:54.121928 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:54 crc kubenswrapper[4846]: I1122 09:13:54.122932 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:54 crc kubenswrapper[4846]: I1122 09:13:54.122967 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:54 crc kubenswrapper[4846]: I1122 09:13:54.122981 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:54 crc kubenswrapper[4846]: I1122 09:13:54.327753 4846 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 09:13:54 crc kubenswrapper[4846]: I1122 09:13:54.327878 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 22 09:13:55 crc kubenswrapper[4846]: I1122 09:13:55.124228 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:13:55 crc kubenswrapper[4846]: I1122 09:13:55.125421 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:13:55 crc kubenswrapper[4846]: I1122 09:13:55.125496 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:13:55 crc kubenswrapper[4846]: I1122 09:13:55.125517 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:13:56 crc kubenswrapper[4846]: E1122 09:13:56.158542 4846 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 09:13:57 crc kubenswrapper[4846]: I1122 09:13:57.286595 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 09:13:57 crc 
Nov 22 09:13:57 crc kubenswrapper[4846]: I1122 09:13:57.286835 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:57 crc kubenswrapper[4846]: I1122 09:13:57.288469 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:57 crc kubenswrapper[4846]: I1122 09:13:57.288520 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:57 crc kubenswrapper[4846]: I1122 09:13:57.288531 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:57 crc kubenswrapper[4846]: I1122 09:13:57.291492 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 09:13:57 crc kubenswrapper[4846]: I1122 09:13:57.682517 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 09:13:57 crc kubenswrapper[4846]: I1122 09:13:57.686845 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 09:13:57 crc kubenswrapper[4846]: I1122 09:13:57.809195 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 09:13:58 crc kubenswrapper[4846]: I1122 09:13:58.131171 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:58 crc kubenswrapper[4846]: I1122 09:13:58.132606 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:58 crc kubenswrapper[4846]: I1122 09:13:58.132647 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:58 crc kubenswrapper[4846]: I1122 09:13:58.132658 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:13:59 crc kubenswrapper[4846]: I1122 09:13:59.133915 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:13:59 crc kubenswrapper[4846]: I1122 09:13:59.135591 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:13:59 crc kubenswrapper[4846]: I1122 09:13:59.135653 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:13:59 crc kubenswrapper[4846]: I1122 09:13:59.135671 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.026755 4846 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
192.168.126.11:17697: connect: connection refused" Nov 22 09:14:00 crc kubenswrapper[4846]: W1122 09:14:00.307182 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.308120 4846 trace.go:236] Trace[485113793]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 09:13:50.305) (total time: 10003ms): Nov 22 09:14:00 crc kubenswrapper[4846]: Trace[485113793]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (09:14:00.307) Nov 22 09:14:00 crc kubenswrapper[4846]: Trace[485113793]: [10.003061592s] [10.003061592s] END Nov 22 09:14:00 crc kubenswrapper[4846]: E1122 09:14:00.308265 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.493959 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.494353 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.496443 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.496491 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.496502 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:00 crc kubenswrapper[4846]: W1122 09:14:00.553333 4846 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.553460 4846 trace.go:236] Trace[295687924]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 09:13:50.552) (total time: 10001ms): Nov 22 09:14:00 crc kubenswrapper[4846]: Trace[295687924]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:14:00.553) Nov 22 09:14:00 crc kubenswrapper[4846]: Trace[295687924]: [10.001308181s] [10.001308181s] END Nov 22 09:14:00 crc kubenswrapper[4846]: E1122 09:14:00.553486 4846 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.570631 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 22 09:14:00 crc 
Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.673065 4846 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.673162 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.679206 4846 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 22 09:14:00 crc kubenswrapper[4846]: I1122 09:14:00.679266 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 22 09:14:01 crc kubenswrapper[4846]: I1122 09:14:01.139880 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:14:01 crc kubenswrapper[4846]: I1122 09:14:01.141140 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:01 crc kubenswrapper[4846]: I1122 09:14:01.141179 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:01 crc kubenswrapper[4846]: I1122 09:14:01.141190 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:01 crc kubenswrapper[4846]: I1122 09:14:01.154388 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Nov 22 09:14:02 crc kubenswrapper[4846]: I1122 09:14:02.141972 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 09:14:02 crc kubenswrapper[4846]: I1122 09:14:02.142791 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:02 crc kubenswrapper[4846]: I1122 09:14:02.142831 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:02 crc kubenswrapper[4846]: I1122 09:14:02.142843 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:02 crc kubenswrapper[4846]: I1122 09:14:02.779696 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 09:14:02 crc kubenswrapper[4846]: I1122 09:14:02.779881 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
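The probe traffic above comes in pairs: patch_prober logs the raw HTTP response (here a 403, apparently because the kubelet's unauthenticated startup probe is rejected as system:anonymous on /livez until the apiserver is ready to authorize it), while prober.go logs the structured failure with klog key=value fields. The second form is easy to turn into a failure timeline; an illustrative sketch against the same assumed kubelet.log:

    import re

    # Pull probeType / pod / container out of the prober.go "Probe failed" lines.
    probe = re.compile(
        r'probeType="(?P<type>[^"]+)" pod="(?P<pod>[^"]+)" '
        r'podUID="[^"]+" containerName="(?P<container>[^"]+)"'
    )

    with open("kubelet.log") as f:  # assumed local copy of this journal
        for line in f:
            if '"Probe failed"' not in line:
                continue
            m = probe.search(line)
            if m:
                print(m.group("type"), m.group("pod"), m.group("container"))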
event="NodeHasSufficientMemory" Nov 22 09:14:02 crc kubenswrapper[4846]: I1122 09:14:02.781240 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:02 crc kubenswrapper[4846]: I1122 09:14:02.781252 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:02 crc kubenswrapper[4846]: I1122 09:14:02.784072 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:14:03 crc kubenswrapper[4846]: I1122 09:14:03.143967 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:14:03 crc kubenswrapper[4846]: I1122 09:14:03.145160 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:03 crc kubenswrapper[4846]: I1122 09:14:03.145194 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:03 crc kubenswrapper[4846]: I1122 09:14:03.145204 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:04 crc kubenswrapper[4846]: I1122 09:14:04.328794 4846 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 09:14:04 crc kubenswrapper[4846]: I1122 09:14:04.328883 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 22 09:14:04 crc kubenswrapper[4846]: I1122 09:14:04.592904 4846 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 22 09:14:05 crc kubenswrapper[4846]: E1122 09:14:05.663206 4846 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.667184 4846 trace.go:236] Trace[1815025913]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 09:13:54.119) (total time: 11547ms): Nov 22 09:14:05 crc kubenswrapper[4846]: Trace[1815025913]: ---"Objects listed" error: 11547ms (09:14:05.667) Nov 22 09:14:05 crc kubenswrapper[4846]: Trace[1815025913]: [11.547264628s] [11.547264628s] END Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.667219 4846 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.667215 4846 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 22 09:14:05 crc kubenswrapper[4846]: E1122 09:14:05.667577 4846 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" 
node="crc" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.669923 4846 trace.go:236] Trace[368844146]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 09:13:55.481) (total time: 10188ms): Nov 22 09:14:05 crc kubenswrapper[4846]: Trace[368844146]: ---"Objects listed" error: 10188ms (09:14:05.669) Nov 22 09:14:05 crc kubenswrapper[4846]: Trace[368844146]: [10.188576267s] [10.188576267s] END Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.669945 4846 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.783678 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.800895 4846 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58856->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.800968 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58856->192.168.126.11:17697: read: connection reset by peer" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.960212 4846 apiserver.go:52] "Watching apiserver" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.963459 4846 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.963908 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-q52w8","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.964384 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.964463 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.964510 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:05 crc kubenswrapper[4846]: E1122 09:14:05.964563 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:05 crc kubenswrapper[4846]: E1122 09:14:05.964562 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.964688 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.964763 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.965353 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:05 crc kubenswrapper[4846]: E1122 09:14:05.965426 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.965674 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-q52w8" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.967678 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.967714 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.967716 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.967736 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.968421 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.968769 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.968805 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.968875 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.968964 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.969000 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.969160 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.972248 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.976474 4846 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 22 09:14:05 crc kubenswrapper[4846]: I1122 09:14:05.988754 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.001690 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.012400 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.027439 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.039279 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.053357 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.062784 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.068960 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069013 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069031 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069068 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069087 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069109 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069124 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069144 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069165 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069182 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069198 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069217 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069235 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069254 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069272 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069290 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069319 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069338 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069356 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069376 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069394 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069412 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069437 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069453 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069475 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069496 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069516 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069514 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069559 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069514 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069543 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069676 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069685 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069756 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069700 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069788 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069811 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069863 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069849 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069910 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069939 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069963 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.069988 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070010 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070028 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070034 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070082 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070106 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070130 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070156 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070178 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070207 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070237 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070264 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070314 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") 
pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070353 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070379 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070357 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070399 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070421 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070440 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070468 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070451 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070492 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070576 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070589 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070622 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070654 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070693 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070728 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070756 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070787 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070815 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070845 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070873 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070900 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070928 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070955 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070981 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071008 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071058 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071089 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071116 4846 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071153 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071180 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071203 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071228 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071253 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071277 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071303 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071328 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071355 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 
09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071386 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071414 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071453 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071481 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071506 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071535 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071558 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071582 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071607 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071630 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 
09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071655 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071677 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071701 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071728 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071753 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071784 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071811 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071836 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071858 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071902 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071929 4846 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071959 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071985 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072009 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072036 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072081 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072104 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072126 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072159 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072184 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072212 4846 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072236 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072262 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072292 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072318 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072346 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072373 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072402 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072430 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072459 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 
09:14:06.072485 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072510 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072536 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072561 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072586 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072609 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072636 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072664 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072688 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072710 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 
09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072733 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072759 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072786 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072812 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072836 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072862 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072891 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072914 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072938 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072962 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072987 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073012 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073058 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073084 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073112 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073138 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073166 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073192 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073221 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073247 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073276 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073301 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073324 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073350 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073380 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073505 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073534 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073558 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073583 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073611 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073637 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073664 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073689 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073717 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073743 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073806 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073837 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073863 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073889 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073915 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073944 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073969 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073997 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074025 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074138 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074165 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074192 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074218 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074243 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074267 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074318 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074349 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074378 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074406 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074433 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074458 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074486 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074513 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074540 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074567 4846 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074597 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074627 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074655 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074684 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074708 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074772 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074813 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074849 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074885 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074932 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074985 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075026 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075075 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2300dfaf-ea26-4b33-8d8c-ab337aa56402-hosts-file\") pod \"node-resolver-q52w8\" (UID: \"2300dfaf-ea26-4b33-8d8c-ab337aa56402\") " pod="openshift-dns/node-resolver-q52w8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075103 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlpfm\" (UniqueName: \"kubernetes.io/projected/2300dfaf-ea26-4b33-8d8c-ab337aa56402-kube-api-access-hlpfm\") pod \"node-resolver-q52w8\" (UID: \"2300dfaf-ea26-4b33-8d8c-ab337aa56402\") " pod="openshift-dns/node-resolver-q52w8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075139 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075183 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070640 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075214 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070674 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.070850 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075257 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075292 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075324 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075352 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075516 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075538 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075554 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075571 4846 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075588 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075605 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075626 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075646 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075664 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075679 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075693 4846 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075707 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075722 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075738 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075753 4846 reconciler_common.go:293] 
"Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.082003 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.085620 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071114 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071348 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071335 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071401 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071538 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071675 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.071909 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072032 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072029 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072222 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072201 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072453 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.072996 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073223 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073313 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073348 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073832 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.073695 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.074245 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075568 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.075599 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.076134 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.076233 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.076935 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.077027 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.077143 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.077201 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.077291 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.077456 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.077517 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.077753 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.077902 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.077980 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.078121 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.078186 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.078483 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.078497 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.078035 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.078557 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.079176 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.079734 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.080205 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.080407 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.080478 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.080453 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.080726 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.080841 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.080867 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.081058 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.081078 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.081355 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.081491 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.081578 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.081651 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.081723 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.082077 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.082244 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.082510 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.082749 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.083158 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.083461 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.083485 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.082743 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.083719 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.084146 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.084253 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.084293 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.084966 4846 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.088081 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.088133 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:06.588106998 +0000 UTC m=+21.523796647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.087324 4846 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.088473 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:06.588461128 +0000 UTC m=+21.524151017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.089061 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.089121 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.092535 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.092570 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.092653 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.092708 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.092909 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.093695 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.093891 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.094096 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.094430 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.095389 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.095557 4846 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.096345 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.096438 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.097032 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.097290 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.097312 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.097440 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.098507 4846 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.098593 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:06.598563503 +0000 UTC m=+21.534253152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.098910 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.099076 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.099162 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.099216 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.099280 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.099623 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.099847 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.099900 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.100631 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.100872 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.100910 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.101015 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.100964 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.101059 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.101126 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.101120 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.101388 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.101769 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.101951 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.102028 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.102063 4846 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.102173 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:14:06.602137634 +0000 UTC m=+21.537827403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.102207 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:06.602195455 +0000 UTC m=+21.537885324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.102697 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.102965 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.103100 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.103348 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.103342 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.103577 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.104107 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.104383 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.104399 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.104728 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.105299 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.105647 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.105714 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.105737 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.105661 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.105961 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.106730 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.106784 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.106806 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.107090 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.107142 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.107327 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.107622 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.107729 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.108888 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.108897 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.109088 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.109170 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.109168 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.109613 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.109672 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.109795 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.110012 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.110117 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.110406 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.111198 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.116427 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.116558 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.105139 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.118326 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.118779 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.119139 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.119619 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.119819 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.120768 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.121424 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.122390 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.124936 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.125159 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.125438 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.128111 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.129062 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.129378 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.131246 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.131345 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.131421 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.131636 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.131656 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.132095 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.132325 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.132378 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.133027 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.133152 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.133253 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.133327 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.133420 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.133973 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.134576 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.136407 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.139605 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.140147 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.141284 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.141506 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.141677 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.142104 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.146228 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.147839 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.152366 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.159550 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.160894 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.162105 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.162233 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.162377 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.161031 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:13:50Z\\\",\\\"message\\\":\\\"W1122 09:13:49.429768 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 09:13:49.431173 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763802829 cert, and key in /tmp/serving-cert-3973349850/serving-signer.crt, /tmp/serving-cert-3973349850/serving-signer.key\\\\nI1122 09:13:49.815844 1 observer_polling.go:159] Starting file observer\\\\nW1122 09:13:49.818582 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 09:13:49.818771 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 09:13:49.820482 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3973349850/tls.crt::/tmp/serving-cert-3973349850/tls.key\\\\\\\"\\\\nF1122 09:13:50.053040 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.166299 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.168367 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.169763 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.171177 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176266 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176676 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176742 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176764 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2300dfaf-ea26-4b33-8d8c-ab337aa56402-hosts-file\") pod \"node-resolver-q52w8\" (UID: \"2300dfaf-ea26-4b33-8d8c-ab337aa56402\") " pod="openshift-dns/node-resolver-q52w8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176783 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlpfm\" (UniqueName: \"kubernetes.io/projected/2300dfaf-ea26-4b33-8d8c-ab337aa56402-kube-api-access-hlpfm\") pod \"node-resolver-q52w8\" (UID: \"2300dfaf-ea26-4b33-8d8c-ab337aa56402\") " pod="openshift-dns/node-resolver-q52w8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176836 4846 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176849 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176860 4846 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176870 4846 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on 
node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176880 4846 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176890 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176903 4846 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176921 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176938 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176950 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176963 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176976 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.176988 4846 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177000 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177013 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177033 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177068 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" 
DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177082 4846 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177093 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177104 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177113 4846 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177123 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177133 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177143 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177156 4846 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177166 4846 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177176 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177186 4846 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177195 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177203 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177214 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177224 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177234 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177244 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177254 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177264 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177273 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177282 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177292 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177302 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177311 4846 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177321 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177331 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177340 4846 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177350 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177359 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177369 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177379 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177389 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177398 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177407 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177416 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177426 4846 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177436 4846 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177446 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177455 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" 
(UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177464 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177473 4846 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177483 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177493 4846 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177503 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177511 4846 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177540 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177549 4846 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177561 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177571 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177580 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177590 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177599 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 
crc kubenswrapper[4846]: I1122 09:14:06.177607 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177616 4846 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177626 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177636 4846 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177646 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177657 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177668 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177680 4846 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177691 4846 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177703 4846 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177713 4846 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177723 4846 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177735 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177745 4846 
reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177756 4846 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177764 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177773 4846 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177781 4846 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177790 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177800 4846 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177811 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177832 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177843 4846 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177864 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177875 4846 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177884 4846 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc 
kubenswrapper[4846]: I1122 09:14:06.177894 4846 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177909 4846 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177925 4846 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177936 4846 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177947 4846 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177957 4846 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177971 4846 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177982 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177991 4846 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178002 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.177989 4846 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4" exitCode=255 Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178015 4846 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178103 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc 
kubenswrapper[4846]: I1122 09:14:06.178120 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178149 4846 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178160 4846 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178185 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178198 4846 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178210 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178219 4846 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178229 4846 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178238 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178247 4846 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178257 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178266 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178274 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc 
kubenswrapper[4846]: I1122 09:14:06.178284 4846 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178293 4846 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178302 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178312 4846 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178321 4846 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178361 4846 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178377 4846 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178393 4846 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178404 4846 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178416 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178460 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178470 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178481 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178491 4846 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178700 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178759 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4"} Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178822 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.178834 4846 scope.go:117] "RemoveContainer" containerID="0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.179032 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2300dfaf-ea26-4b33-8d8c-ab337aa56402-hosts-file\") pod \"node-resolver-q52w8\" (UID: \"2300dfaf-ea26-4b33-8d8c-ab337aa56402\") " pod="openshift-dns/node-resolver-q52w8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.179648 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.179702 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180019 4846 scope.go:117] "RemoveContainer" containerID="8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180325 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180341 4846 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180352 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180360 4846 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180371 4846 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180380 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180389 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180397 4846 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180405 4846 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180414 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180425 4846 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180433 4846 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180443 4846 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180452 4846 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180460 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180469 4846 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180478 4846 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180487 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180495 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180507 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180517 4846 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180527 4846 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180536 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180546 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180555 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180564 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180597 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180608 4846 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180617 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180626 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180635 4846 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180643 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180653 4846 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180662 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180672 4846 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180681 4846 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180689 4846 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180698 4846 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180707 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180716 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180725 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180733 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.180742 4846 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.181360 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.186886 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.192906 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.200690 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlpfm\" (UniqueName: \"kubernetes.io/projected/2300dfaf-ea26-4b33-8d8c-ab337aa56402-kube-api-access-hlpfm\") pod \"node-resolver-q52w8\" (UID: \"2300dfaf-ea26-4b33-8d8c-ab337aa56402\") " pod="openshift-dns/node-resolver-q52w8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.209811 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.213146 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-hbcs8"] Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.216170 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.219585 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.219598 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.219717 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.219761 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.220569 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.236588 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.260512 4846 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:13:50Z\\\",\\\"message\\\":\\\"W1122 09:13:49.429768 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 09:13:49.431173 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763802829 cert, and key in /tmp/serving-cert-3973349850/serving-signer.crt, /tmp/serving-cert-3973349850/serving-signer.key\\\\nI1122 09:13:49.815844 1 observer_polling.go:159] Starting file observer\\\\nW1122 09:13:49.818582 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 09:13:49.818771 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 09:13:49.820482 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3973349850/tls.crt::/tmp/serving-cert-3973349850/tls.key\\\\\\\"\\\\nF1122 09:13:50.053040 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 
certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.277527 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.283947 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.284172 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.292167 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.295512 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.297010 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-q52w8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.344013 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.360428 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.382268 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.384979 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-run-multus-certs\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385036 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-cni-dir\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385071 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-run-netns\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385113 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-system-cni-dir\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385133 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9aec6a38-e6e4-4009-95d2-6a179c7fac04-cni-binary-copy\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385150 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-os-release\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385169 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-socket-dir-parent\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" 
Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385201 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-hostroot\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385223 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-etc-kubernetes\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385248 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm7zc\" (UniqueName: \"kubernetes.io/projected/9aec6a38-e6e4-4009-95d2-6a179c7fac04-kube-api-access-wm7zc\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385270 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-var-lib-cni-bin\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385294 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-var-lib-cni-multus\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385317 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-var-lib-kubelet\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385338 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-daemon-config\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385581 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-cnibin\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.385623 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-run-k8s-cni-cncf-io\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 
09:14:06.385646 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-conf-dir\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.396628 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.408522 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:13:50Z\\\",\\\"message\\\":\\\"W1122 09:13:49.429768 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 09:13:49.431173 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763802829 cert, and key in /tmp/serving-cert-3973349850/serving-signer.crt, /tmp/serving-cert-3973349850/serving-signer.key\\\\nI1122 09:13:49.815844 1 observer_polling.go:159] Starting file observer\\\\nW1122 09:13:49.818582 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 09:13:49.818771 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 09:13:49.820482 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3973349850/tls.crt::/tmp/serving-cert-3973349850/tls.key\\\\\\\"\\\\nF1122 09:13:50.053040 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 
09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.430366 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.454992 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.468633 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 
22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.485144 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486721 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-system-cni-dir\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486768 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9aec6a38-e6e4-4009-95d2-6a179c7fac04-cni-binary-copy\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486791 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-socket-dir-parent\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486814 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-os-release\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486841 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-hostroot\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486862 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-etc-kubernetes\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486886 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm7zc\" (UniqueName: \"kubernetes.io/projected/9aec6a38-e6e4-4009-95d2-6a179c7fac04-kube-api-access-wm7zc\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486910 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-var-lib-cni-multus\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486932 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-var-lib-kubelet\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486930 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-socket-dir-parent\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486954 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-daemon-config\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486975 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-var-lib-cni-bin\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.486993 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-run-k8s-cni-cncf-io\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " 
pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487011 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-conf-dir\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487012 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-var-lib-kubelet\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487028 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-cnibin\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487070 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-run-multus-certs\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487102 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-cni-dir\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487110 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-hostroot\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487124 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-run-netns\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487204 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-run-netns\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487222 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-etc-kubernetes\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487256 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-var-lib-cni-bin\") pod 
\"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487282 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-run-k8s-cni-cncf-io\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487294 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-os-release\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487314 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-conf-dir\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487325 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-run-multus-certs\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487073 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-host-var-lib-cni-multus\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487369 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-cnibin\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487383 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-cni-dir\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487399 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9aec6a38-e6e4-4009-95d2-6a179c7fac04-system-cni-dir\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.487737 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9aec6a38-e6e4-4009-95d2-6a179c7fac04-multus-daemon-config\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.488242 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/9aec6a38-e6e4-4009-95d2-6a179c7fac04-cni-binary-copy\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.499536 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.515248 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm7zc\" (UniqueName: \"kubernetes.io/projected/9aec6a38-e6e4-4009-95d2-6a179c7fac04-kube-api-access-wm7zc\") pod \"multus-hbcs8\" (UID: \"9aec6a38-e6e4-4009-95d2-6a179c7fac04\") " pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.520311 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.530708 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hbcs8" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.536831 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.588275 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.588447 4846 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.588514 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:07.588493852 +0000 UTC m=+22.524183511 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.600262 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-c59mw"] Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.600779 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4h26m"] Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.601144 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.601213 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kws67"] Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.601947 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.602493 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.604857 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.605395 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.606112 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.606167 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.606330 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.606421 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.606467 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.606632 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.606700 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.607850 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.611837 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 
09:14:06.612190 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.612428 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.612447 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.618297 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.630880 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.640962 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.655355 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.665383 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 
22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.676165 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.686546 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.688879 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.689013 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:14:07.688989628 +0000 UTC m=+22.624679287 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.689077 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-system-cni-dir\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.689118 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86a01cc5-5438-4978-8919-2d24f665922a-mcd-auth-proxy-config\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.689165 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 
crc kubenswrapper[4846]: I1122 09:14:06.689192 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-cni-binary-copy\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.689275 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-kubelet\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.689304 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-netns\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.689327 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-ovn-kubernetes\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.689413 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxjb\" (UniqueName: \"kubernetes.io/projected/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-kube-api-access-pnxjb\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.689473 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.689643 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.689667 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.689682 4846 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.689733 4846 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:07.689722779 +0000 UTC m=+22.625412428 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.689508 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts2fd\" (UniqueName: \"kubernetes.io/projected/86a01cc5-5438-4978-8919-2d24f665922a-kube-api-access-ts2fd\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690119 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-bin\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690153 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-log-socket\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690180 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-netd\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690208 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-systemd\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690309 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-var-lib-openvswitch\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690383 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86a01cc5-5438-4978-8919-2d24f665922a-proxy-tls\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 
22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690409 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scw9m\" (UniqueName: \"kubernetes.io/projected/c874da16-5eda-477e-bbd5-e5c105dc7a07-kube-api-access-scw9m\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690435 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690457 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-cnibin\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690482 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-node-log\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690509 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/86a01cc5-5438-4978-8919-2d24f665922a-rootfs\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690532 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-ovn\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690664 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-slash\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690714 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-etc-openvswitch\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690746 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-openvswitch\") pod 
\"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690775 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-config\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690805 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-os-release\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690852 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690887 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-systemd-units\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690913 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-script-lib\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.690947 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.690960 4846 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.691008 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:07.690993465 +0000 UTC m=+22.626683334 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.691053 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.691114 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.691123 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-env-overrides\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.691179 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovn-node-metrics-cert\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.691142 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.691220 4846 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:06 crc kubenswrapper[4846]: E1122 09:14:06.691262 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:07.691252012 +0000 UTC m=+22.626941861 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.699884 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:13:50Z\\\",\\\"message\\\":\\\"W1122 09:13:49.429768 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 09:13:49.431173 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763802829 cert, and key in /tmp/serving-cert-3973349850/serving-signer.crt, /tmp/serving-cert-3973349850/serving-signer.key\\\\nI1122 09:13:49.815844 1 observer_polling.go:159] Starting file observer\\\\nW1122 09:13:49.818582 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 09:13:49.818771 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 09:13:49.820482 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3973349850/tls.crt::/tmp/serving-cert-3973349850/tls.key\\\\\\\"\\\\nF1122 09:13:50.053040 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.704390 4846 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.712522 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.724491 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.744996 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:13:50Z\\\",\\\"message\\\":\\\"W1122 09:13:49.429768 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 09:13:49.431173 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763802829 cert, and key in /tmp/serving-cert-3973349850/serving-signer.crt, /tmp/serving-cert-3973349850/serving-signer.key\\\\nI1122 09:13:49.815844 1 observer_polling.go:159] Starting file observer\\\\nW1122 09:13:49.818582 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 09:13:49.818771 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 09:13:49.820482 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3973349850/tls.crt::/tmp/serving-cert-3973349850/tls.key\\\\\\\"\\\\nF1122 09:13:50.053040 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.761711 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.774096 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.786469 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 
127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792209 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-systemd\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792243 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-log-socket\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792264 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-netd\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792280 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-var-lib-openvswitch\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792299 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792318 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86a01cc5-5438-4978-8919-2d24f665922a-proxy-tls\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792336 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scw9m\" (UniqueName: \"kubernetes.io/projected/c874da16-5eda-477e-bbd5-e5c105dc7a07-kube-api-access-scw9m\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792352 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-cnibin\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792359 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-netd\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 
22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792366 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-node-log\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792392 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-systemd\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792415 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/86a01cc5-5438-4978-8919-2d24f665922a-rootfs\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792432 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/86a01cc5-5438-4978-8919-2d24f665922a-rootfs\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792393 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-node-log\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792461 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-log-socket\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792516 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-var-lib-openvswitch\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792613 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-cnibin\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792644 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-ovn\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792681 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-os-release\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792704 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-slash\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792710 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-ovn\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792725 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-etc-openvswitch\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792695 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792747 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-slash\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792748 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-openvswitch\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792783 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-openvswitch\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792790 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-etc-openvswitch\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792754 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-os-release\") pod 
\"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792788 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-config\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792944 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-systemd-units\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792968 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-script-lib\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.792989 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793023 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-env-overrides\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793061 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovn-node-metrics-cert\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793085 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-system-cni-dir\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793108 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86a01cc5-5438-4978-8919-2d24f665922a-mcd-auth-proxy-config\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793125 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793145 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-cni-binary-copy\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793160 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-kubelet\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793183 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-netns\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793201 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-ovn-kubernetes\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793241 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnxjb\" (UniqueName: \"kubernetes.io/projected/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-kube-api-access-pnxjb\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793260 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-bin\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793281 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts2fd\" (UniqueName: \"kubernetes.io/projected/86a01cc5-5438-4978-8919-2d24f665922a-kube-api-access-ts2fd\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793630 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-config\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793925 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-script-lib\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.793986 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-systemd-units\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794025 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-system-cni-dir\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794074 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794123 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-kubelet\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794149 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-netns\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794173 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-ovn-kubernetes\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794212 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-cni-binary-copy\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794319 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-bin\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794373 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794565 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-env-overrides\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.794894 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/86a01cc5-5438-4978-8919-2d24f665922a-mcd-auth-proxy-config\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.796537 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86a01cc5-5438-4978-8919-2d24f665922a-proxy-tls\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.797811 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.797828 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovn-node-metrics-cert\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.806862 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.809112 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts2fd\" (UniqueName: \"kubernetes.io/projected/86a01cc5-5438-4978-8919-2d24f665922a-kube-api-access-ts2fd\") pod \"machine-config-daemon-c59mw\" (UID: \"86a01cc5-5438-4978-8919-2d24f665922a\") " pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.810625 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxjb\" (UniqueName: \"kubernetes.io/projected/18c1f212-c2f8-4c90-bd30-57ed4dc2fc84-kube-api-access-pnxjb\") pod \"multus-additional-cni-plugins-4h26m\" (UID: \"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\") " pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.811548 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scw9m\" (UniqueName: \"kubernetes.io/projected/c874da16-5eda-477e-bbd5-e5c105dc7a07-kube-api-access-scw9m\") pod \"ovnkube-node-kws67\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.816649 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.825739 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.834740 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 
22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.845320 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.868187 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.914987 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.915280 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.926557 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:06 crc kubenswrapper[4846]: W1122 09:14:06.928859 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a01cc5_5438_4978_8919_2d24f665922a.slice/crio-7c1fa64866cea307a7868a04713df9b80401fe6e1dde4e1a1ed02a1327c20909 WatchSource:0}: Error finding container 7c1fa64866cea307a7868a04713df9b80401fe6e1dde4e1a1ed02a1327c20909: Status 404 returned error can't find the container with id 7c1fa64866cea307a7868a04713df9b80401fe6e1dde4e1a1ed02a1327c20909 Nov 22 09:14:06 crc kubenswrapper[4846]: I1122 09:14:06.932022 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4h26m" Nov 22 09:14:06 crc kubenswrapper[4846]: W1122 09:14:06.943297 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc874da16_5eda_477e_bbd5_e5c105dc7a07.slice/crio-62735a5b6b9a38a4c4d8acd7e31b376cf49305316f3d97d331d51174828f3cf6 WatchSource:0}: Error finding container 62735a5b6b9a38a4c4d8acd7e31b376cf49305316f3d97d331d51174828f3cf6: Status 404 returned error can't find the container with id 62735a5b6b9a38a4c4d8acd7e31b376cf49305316f3d97d331d51174828f3cf6 Nov 22 09:14:06 crc kubenswrapper[4846]: W1122 09:14:06.945675 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c1f212_c2f8_4c90_bd30_57ed4dc2fc84.slice/crio-fac285a056f730acbe6995289e8a31f317dea77aeb54ca82a331dac4871698ae WatchSource:0}: Error finding container fac285a056f730acbe6995289e8a31f317dea77aeb54ca82a331dac4871698ae: Status 404 returned error can't find the container with id fac285a056f730acbe6995289e8a31f317dea77aeb54ca82a331dac4871698ae Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.182556 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d5bc68a2bbf3ae703ef8b22344327ea6abedbe2fc1a319b7f6544d4381c3d644"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.183891 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" event={"ID":"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84","Type":"ContainerStarted","Data":"fac285a056f730acbe6995289e8a31f317dea77aeb54ca82a331dac4871698ae"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.192364 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbcs8" event={"ID":"9aec6a38-e6e4-4009-95d2-6a179c7fac04","Type":"ContainerStarted","Data":"0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.192439 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbcs8" event={"ID":"9aec6a38-e6e4-4009-95d2-6a179c7fac04","Type":"ContainerStarted","Data":"4114d07b4bfc8bfacddcc55ff852b481b81cd8ea8a260f37ec7f98a67b7b3e17"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.217409 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q52w8" event={"ID":"2300dfaf-ea26-4b33-8d8c-ab337aa56402","Type":"ContainerStarted","Data":"1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.217476 4846 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q52w8" event={"ID":"2300dfaf-ea26-4b33-8d8c-ab337aa56402","Type":"ContainerStarted","Data":"30f771a11d8033742697abb91eeda3dc2085d97255fd8fed9784e49405950c4e"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.219048 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.219121 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.219136 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a1475189448ccc5f4ffc88b0a26ecc02e1bc70226ed78621d9d9298dfcc100c1"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.236424 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.248283 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.248340 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"7c1fa64866cea307a7868a04713df9b80401fe6e1dde4e1a1ed02a1327c20909"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.255717 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.255776 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"35bff60372b315076d12f6d705510d9b8e67f7f6f3154ac539c8979ef49d286b"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.267988 4846 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.276625 4846 scope.go:117] "RemoveContainer" containerID="8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4" Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.276792 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.279997 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.281833 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49" exitCode=0 Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.281876 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.281904 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"62735a5b6b9a38a4c4d8acd7e31b376cf49305316f3d97d331d51174828f3cf6"} Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.308566 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.328504 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.343273 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0749c30adac23b20dee5a196dcbc627333d63cc707e947109a792cad91ed8cd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:13:50Z\\\",\\\"message\\\":\\\"W1122 09:13:49.429768 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 
09:13:49.431173 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763802829 cert, and key in /tmp/serving-cert-3973349850/serving-signer.crt, /tmp/serving-cert-3973349850/serving-signer.key\\\\nI1122 09:13:49.815844 1 observer_polling.go:159] Starting file observer\\\\nW1122 09:13:49.818582 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 09:13:49.818771 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 09:13:49.820482 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3973349850/tls.crt::/tmp/serving-cert-3973349850/tls.key\\\\\\\"\\\\nF1122 09:13:50.053040 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap 
from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.361997 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.377320 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.393188 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.408855 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.422430 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.438455 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.451840 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.471272 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.486521 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.513204 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.556209 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.592609 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.601038 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.601196 4846 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.601280 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:09.601261107 +0000 UTC m=+24.536950766 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.633322 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.671413 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.701686 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.701828 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:14:09.701803715 +0000 UTC m=+24.637493364 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.701860 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.701889 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.701923 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702018 4846 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702027 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702070 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702083 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:09.702073982 +0000 UTC m=+24.637763631 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702086 4846 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702115 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:09.702108993 +0000 UTC m=+24.637798642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702027 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702135 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702143 4846 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:07 crc kubenswrapper[4846]: E1122 09:14:07.702162 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:09.702156665 +0000 UTC m=+24.637846314 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.719691 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.749400 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.792135 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 
09:14:07.830817 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:07 crc kubenswrapper[4846]: I1122 09:14:07.875515 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\
\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-
netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"20
25-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:07Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.034691 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.034732 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.034790 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:08 crc kubenswrapper[4846]: E1122 09:14:08.034855 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:08 crc kubenswrapper[4846]: E1122 09:14:08.034993 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:08 crc kubenswrapper[4846]: E1122 09:14:08.035085 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.039108 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.039759 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.040785 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.041617 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.042382 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.043008 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.043802 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.044586 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.047084 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.047984 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.048695 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.049641 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.050519 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.051404 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.052268 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.052927 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.053701 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.054195 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.054804 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.055672 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.056303 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.058570 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.059291 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.060068 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.060673 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.061310 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.061993 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.062573 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.063189 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.063669 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.064162 4846 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.064261 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.065662 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.066201 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.066630 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.067810 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.071298 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.071888 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.073102 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.073740 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.074605 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.075320 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.076284 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.076964 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.077978 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.078629 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.079644 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.080347 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.081204 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.081749 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.082622 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.083282 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.083857 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.084684 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.290567 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"} Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.290646 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"} Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.290658 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"} Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.290667 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"} Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.292107 4846 generic.go:334] "Generic (PLEG): container finished" podID="18c1f212-c2f8-4c90-bd30-57ed4dc2fc84" containerID="5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7" exitCode=0 Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.292154 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" event={"ID":"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84","Type":"ContainerDied","Data":"5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7"} Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.294806 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c"} Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.305170 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.316914 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.330809 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.354440 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.373300 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webho
ok\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.390814 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.408983 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.429141 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.449359 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.463657 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.466787 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-grx77"] Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.467275 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.469313 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.469677 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.469772 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.469839 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.477320 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.493474 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.507338 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppnm5\" (UniqueName: \"kubernetes.io/projected/8814a472-d38b-4083-9294-d48a525987c4-kube-api-access-ppnm5\") pod \"node-ca-grx77\" (UID: \"8814a472-d38b-4083-9294-d48a525987c4\") " pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.507414 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8814a472-d38b-4083-9294-d48a525987c4-host\") pod \"node-ca-grx77\" (UID: \"8814a472-d38b-4083-9294-d48a525987c4\") " pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.507444 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8814a472-d38b-4083-9294-d48a525987c4-serviceca\") pod \"node-ca-grx77\" (UID: \"8814a472-d38b-4083-9294-d48a525987c4\") " pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.508491 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.525568 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.552522 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.598463 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z 
is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.608390 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8814a472-d38b-4083-9294-d48a525987c4-host\") pod \"node-ca-grx77\" (UID: \"8814a472-d38b-4083-9294-d48a525987c4\") " pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.608432 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8814a472-d38b-4083-9294-d48a525987c4-serviceca\") pod \"node-ca-grx77\" (UID: \"8814a472-d38b-4083-9294-d48a525987c4\") " pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.608471 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppnm5\" (UniqueName: \"kubernetes.io/projected/8814a472-d38b-4083-9294-d48a525987c4-kube-api-access-ppnm5\") pod \"node-ca-grx77\" (UID: \"8814a472-d38b-4083-9294-d48a525987c4\") " pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.608555 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8814a472-d38b-4083-9294-d48a525987c4-host\") pod \"node-ca-grx77\" (UID: \"8814a472-d38b-4083-9294-d48a525987c4\") " pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.609443 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8814a472-d38b-4083-9294-d48a525987c4-serviceca\") pod \"node-ca-grx77\" (UID: \"8814a472-d38b-4083-9294-d48a525987c4\") " pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.629785 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.658282 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppnm5\" (UniqueName: \"kubernetes.io/projected/8814a472-d38b-4083-9294-d48a525987c4-kube-api-access-ppnm5\") pod \"node-ca-grx77\" (UID: \"8814a472-d38b-4083-9294-d48a525987c4\") " pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.691880 4846 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.734345 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.771638 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.788136 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-grx77" Nov 22 09:14:08 crc kubenswrapper[4846]: W1122 09:14:08.800358 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8814a472_d38b_4083_9294_d48a525987c4.slice/crio-05b54fefe4c0995116d193f71cf8bb9a3d31ee2dbb9206f9e9c9fc9a3862c495 WatchSource:0}: Error finding container 05b54fefe4c0995116d193f71cf8bb9a3d31ee2dbb9206f9e9c9fc9a3862c495: Status 404 returned error can't find the container with id 05b54fefe4c0995116d193f71cf8bb9a3d31ee2dbb9206f9e9c9fc9a3862c495 Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.814029 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for 
*v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.852129 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.889847 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.930079 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:08 crc kubenswrapper[4846]: I1122 09:14:08.969026 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:08Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.165674 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.166834 4846 scope.go:117] "RemoveContainer" containerID="8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4" Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.167027 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.300380 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" event={"ID":"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84","Type":"ContainerStarted","Data":"7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f"} Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.301736 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-grx77" event={"ID":"8814a472-d38b-4083-9294-d48a525987c4","Type":"ContainerStarted","Data":"c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7"} Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.301787 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-grx77" event={"ID":"8814a472-d38b-4083-9294-d48a525987c4","Type":"ContainerStarted","Data":"05b54fefe4c0995116d193f71cf8bb9a3d31ee2dbb9206f9e9c9fc9a3862c495"} Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.305591 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" 
event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"} Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.305640 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"} Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.306978 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b"} Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.316660 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.329136 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.340534 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.352177 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.366941 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.378065 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.401581 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.414132 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.431136 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.445092 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 
09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.458731 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.472618 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.492416 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.531757 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.574491 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.613507 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.618025 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.618280 4846 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.618428 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-22 09:14:13.618397472 +0000 UTC m=+28.554087161 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.653955 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.689331 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.719081 4846 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.719200 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.719225 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719276 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:14:13.719246499 +0000 UTC m=+28.654936148 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.719327 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719338 4846 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719390 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:13.719375732 +0000 UTC m=+28.655065381 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719455 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719495 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719511 4846 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719588 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:13.719566498 +0000 UTC m=+28.655256157 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719626 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719675 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719692 4846 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:09 crc kubenswrapper[4846]: E1122 09:14:09.719776 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:13.719748733 +0000 UTC m=+28.655438382 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.730323 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.770744 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.825536 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.855678 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.892691 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.930498 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:09 crc kubenswrapper[4846]: I1122 09:14:09.971291 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:09Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.016081 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.026231 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.026958 4846 scope.go:117] "RemoveContainer" containerID="8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4" Nov 22 09:14:10 crc kubenswrapper[4846]: E1122 09:14:10.027184 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.035093 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.035093 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:10 crc kubenswrapper[4846]: E1122 09:14:10.035277 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.035119 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:10 crc kubenswrapper[4846]: E1122 09:14:10.035309 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:10 crc kubenswrapper[4846]: E1122 09:14:10.035373 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.313123 4846 generic.go:334] "Generic (PLEG): container finished" podID="18c1f212-c2f8-4c90-bd30-57ed4dc2fc84" containerID="7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f" exitCode=0 Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.313219 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" event={"ID":"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84","Type":"ContainerDied","Data":"7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f"} Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.345981 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.361944 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.377449 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.394388 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.408217 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.424585 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.440735 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.458844 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.472229 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.488097 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.501675 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.526495 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z 
is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.541420 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.570796 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.608827 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.653020 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.691293 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.745926 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.773799 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 
09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.812454 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.854266 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.895590 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.933870 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:10 crc kubenswrapper[4846]: I1122 09:14:10.970810 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:10Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.012119 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.049686 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.318369 4846 generic.go:334] "Generic (PLEG): container finished" podID="18c1f212-c2f8-4c90-bd30-57ed4dc2fc84" containerID="026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257" exitCode=0 Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.318579 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" event={"ID":"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84","Type":"ContainerDied","Data":"026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257"} Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.332203 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is 
after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.334242 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.340955 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.344485 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.354396 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.376626 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.396693 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.413774 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.426172 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.440974 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.454383 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.473075 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.492935 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.512722 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.551927 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.591580 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.631810 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.676187 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.713983 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.754019 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.791292 4846 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.837981 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.872539 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-ku
bernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.913737 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.952800 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:11 crc kubenswrapper[4846]: I1122 09:14:11.991623 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:11Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.031274 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.034413 4846 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.034421 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.034536 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.034607 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.034721 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.034828 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.067663 4846 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.068742 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.070393 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.070445 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.070480 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc 
kubenswrapper[4846]: I1122 09:14:12.070681 4846 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.131304 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.145331 4846 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.145738 4846 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.147103 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.147165 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.147178 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.147197 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.147207 4846 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.163997 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.168138 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.168202 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.168215 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.168238 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.168252 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.180822 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z"
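Every failed patch above dies on the same TLS error: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 is presenting a serving certificate whose notAfter (2025-08-24T17:21:41Z) is months behind the node clock (2025-11-22T09:14:12Z), the classic symptom of a CRC VM resumed long after its cluster certificates were minted. A minimal Go sketch for confirming this from the node itself, assuming only that the webhook is still listening on the address shown in the log (certcheck.go is a hypothetical helper, not part of the cluster):

    // certcheck.go: print the validity window of the certificate served
    // on the webhook port named in the errors above.
    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        // InsecureSkipVerify because the goal is to inspect the certificate,
        // not to trust it; a verifying handshake would fail exactly as the
        // kubelet's does in the log.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()

        certs := conn.ConnectionState().PeerCertificates
        if len(certs) == 0 {
            fmt.Println("no peer certificate presented")
            return
        }
        cert := certs[0]
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
        fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
        if time.Now().After(cert.NotAfter) {
            // The same condition the x509 library reports in the log lines.
            fmt.Println("certificate has expired")
        }
    }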
event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.184937 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.184959 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.184971 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.192650 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.192650 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z"
Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.198288 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[... image list identical to the status patch above ...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z"
Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.202885 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.202948 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.202976 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.202995 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.217669 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.217669 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous attempt ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z"
Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.221235 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.221322 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.221354 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.221383 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.234187 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.234187 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous attempt ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z"
Nov 22 09:14:12 crc kubenswrapper[4846]: E1122 09:14:12.234342 4846 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
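The burst of attempts above is one pass of the kubelet's node-status sync: each try PATCHes the Node object, the expired webhook rejects it, and after a fixed number of tries the pass gives up with "update node status exceeds retry count" until the next sync interval starts the cycle over. A paraphrase of that loop's shape, for the control flow only (in the kubelet sources the constant is nodeStatusUpdateRetry = 5, and the two messages match the log lines above; this is not the kubelet's code):

    // retryloop.go: the control flow behind "will retry" followed by
    // "exceeds retry count".
    package main

    import (
        "errors"
        "fmt"
    )

    const nodeStatusUpdateRetry = 5 // kubelet's retry budget per sync pass

    // tryUpdateNodeStatus stands in for the PATCH that the webhook keeps
    // rejecting; in this sketch it always fails, as it does in the log.
    func tryUpdateNodeStatus() error {
        return errors.New("failed calling webhook: certificate has expired")
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := tryUpdateNodeStatus(); err != nil {
                fmt.Println("Error updating node status, will retry:", err)
                continue
            }
            return nil
        }
        return errors.New("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            fmt.Println(err)
        }
    }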
event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.236288 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.236297 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.236311 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.236321 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.324134 4846 generic.go:334] "Generic (PLEG): container finished" podID="18c1f212-c2f8-4c90-bd30-57ed4dc2fc84" containerID="e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad" exitCode=0 Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.324207 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" event={"ID":"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84","Type":"ContainerDied","Data":"e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.330425 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.339498 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.339530 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.339539 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.339555 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.339565 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.341602 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.358326 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.371568 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.385496 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.399908 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.439672 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.446271 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.446311 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.446321 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.446346 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.446356 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.471890 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.513267 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.548908 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.549370 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.549383 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.549405 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.549419 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.551058 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.592671 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.630623 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.651538 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.651595 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.651612 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.651631 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.651644 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.671455 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.709614 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.749601 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:12Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.754466 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.754500 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.754514 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.754532 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.754544 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.857110 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.857155 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.857170 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.857188 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.857200 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.959955 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.959996 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.960007 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.960023 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:12 crc kubenswrapper[4846]: I1122 09:14:12.960034 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:12Z","lastTransitionTime":"2025-11-22T09:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.062470 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.062525 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.062538 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.062557 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.062573 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.165095 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.165148 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.165159 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.165173 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.165182 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.267822 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.267856 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.267865 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.267895 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.267906 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.337993 4846 generic.go:334] "Generic (PLEG): container finished" podID="18c1f212-c2f8-4c90-bd30-57ed4dc2fc84" containerID="5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b" exitCode=0 Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.338037 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" event={"ID":"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84","Type":"ContainerDied","Data":"5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.360187 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a57
8bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.371459 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.371531 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.371550 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.371581 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.371600 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.391458 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.425817 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.443949 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.471608 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z 
is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.477207 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.477251 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.477263 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.477279 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.477292 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.485323 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.501142 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.518587 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.537638 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.554202 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.570708 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.579805 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.579854 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.579864 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.579880 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.579896 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.588998 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.603123 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.614414 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:13Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.663734 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.663881 4846 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.663969 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" 
failed. No retries permitted until 2025-11-22 09:14:21.6639446 +0000 UTC m=+36.599634249 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.682736 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.682788 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.682801 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.682820 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.682833 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.765007 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.765161 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.765194 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.765245 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765412 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765401 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:14:21.765355372 +0000 UTC m=+36.701045021 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765436 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765474 4846 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765483 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765498 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765513 4846 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765542 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:21.765527127 +0000 UTC m=+36.701216776 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765573 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:21.765552778 +0000 UTC m=+36.701242617 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765635 4846 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:13 crc kubenswrapper[4846]: E1122 09:14:13.765663 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:21.765655061 +0000 UTC m=+36.701344940 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.785445 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.785482 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.785491 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.785506 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.785516 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.889180 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.889238 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.889255 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.889281 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.889300 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.991697 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.991742 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.991754 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.991775 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:13 crc kubenswrapper[4846]: I1122 09:14:13.991788 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:13Z","lastTransitionTime":"2025-11-22T09:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.034534 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.034613 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.034722 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:14 crc kubenswrapper[4846]: E1122 09:14:14.035100 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:14 crc kubenswrapper[4846]: E1122 09:14:14.035185 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:14 crc kubenswrapper[4846]: E1122 09:14:14.035300 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.094461 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.094500 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.094510 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.094528 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.094538 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:14Z","lastTransitionTime":"2025-11-22T09:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.197106 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.197203 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.197229 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.197264 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.197288 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:14Z","lastTransitionTime":"2025-11-22T09:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.300252 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.300307 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.300323 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.300347 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.300361 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:14Z","lastTransitionTime":"2025-11-22T09:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.346608 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.347025 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.350657 4846 generic.go:334] "Generic (PLEG): container finished" podID="18c1f212-c2f8-4c90-bd30-57ed4dc2fc84" containerID="48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417" exitCode=0 Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.350713 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" event={"ID":"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84","Type":"ContainerDied","Data":"48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.360584 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.372951 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.380204 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.386330 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.403491 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.403544 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.403557 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.403576 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.403591 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:14Z","lastTransitionTime":"2025-11-22T09:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.404494 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.417063 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.429415 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.442083 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.453835 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.467765 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.481880 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.494674 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.506223 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.506265 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.506276 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.506295 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.506309 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:14Z","lastTransitionTime":"2025-11-22T09:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.510155 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.527189 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.542626 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.554256 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.563764 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.579503 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.598475 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996
c88ff85ef9ff4fa3eb05c92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.609906 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.609950 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.609963 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.609983 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.609992 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:14Z","lastTransitionTime":"2025-11-22T09:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.611469 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.622650 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.634603 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.647178 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.660768 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4
c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.674814 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.687190 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.698371 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.712619 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.717834 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.717860 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.717872 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.717888 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.717900 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:14Z","lastTransitionTime":"2025-11-22T09:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.728002 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:14Z is after 2025-08-24T17:21:41Z"
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.819947 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.819978 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.819987 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.820003 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.820013 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:14Z","lastTransitionTime":"2025-11-22T09:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.922134 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.922172 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.922182 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.922198 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:14 crc kubenswrapper[4846]: I1122 09:14:14.922209 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:14Z","lastTransitionTime":"2025-11-22T09:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.025282 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.025322 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.025333 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.025353 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.025365 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.127907 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.127943 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.127952 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.127966 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.127975 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.230268 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.230316 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.230325 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.230357 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.230369 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
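Every "Failed to update status for pod" entry above fails for the same root cause: the kubelet's status patch is intercepted by the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, and the TLS handshake is rejected because the node clock (2025-11-22) is past the webhook serving certificate's NotAfter date (2025-08-24). The detail string is the one Go's crypto/x509 produces when chain verification fails on validity dates. A minimal standalone sketch of that same check (the input file name webhook-cert.pem is an illustrative assumption, not a path from this log):

    // certcheck.go - sketch of the validity-window test behind the
    // "x509: certificate has expired or is not yet valid" log detail.
    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "log"
        "os"
        "time"
    )

    func main() {
        pemBytes, err := os.ReadFile("webhook-cert.pem") // hypothetical input
        if err != nil {
            log.Fatal(err)
        }
        block, _ := pem.Decode(pemBytes)
        if block == nil || block.Type != "CERTIFICATE" {
            log.Fatal("no CERTIFICATE block in input")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            log.Fatal(err)
        }
        now := time.Now()
        switch {
        case now.After(cert.NotAfter):
            // Mirrors the failure in the log: current time is past NotAfter.
            fmt.Printf("expired: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
        case now.Before(cert.NotBefore):
            fmt.Printf("not yet valid: current time %s is before %s\n",
                now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
        default:
            fmt.Printf("valid until %s\n", cert.NotAfter.Format(time.RFC3339))
        }
    }

The interleaved NodeNotReady / "Node became not ready" entries are a separate, co-occurring condition: the kubelet keeps the node's Ready condition False while no CNI configuration file exists in /etc/kubernetes/cni/net.d/, which here is pending because ovnkube-controller is still starting. A rough sketch of that directory probe, using the path quoted in the log message; this imitates the reported condition and is not kubelet source code (the accepted extensions are an assumption based on common CNI config naming):

    // cnicheck.go - illustrative probe for the "no CNI configuration file"
    // condition reported in the "Node became not ready" entries.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        confDir := "/etc/kubernetes/cni/net.d" // path from the log message
        entries, err := os.ReadDir(confDir)
        if err != nil {
            fmt.Printf("NetworkReady=false: cannot read %s: %v\n", confDir, err)
            return
        }
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Printf("NetworkReady=true: found %s\n", e.Name())
                return
            }
        }
        fmt.Printf("NetworkReady=false: no CNI configuration file in %s\n", confDir)
    }

Once ovnkube-controller writes its configuration into that directory the Ready condition can flip back, but the webhook failures will persist until the expired certificate is rotated.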
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.333173 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.333249 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.333260 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.333278 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.333291 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.358008 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" event={"ID":"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84","Type":"ContainerStarted","Data":"0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5"}
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.358068 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.358399 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kws67"
Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.373179 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.386744 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.398127 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.415767 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.422564 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.430216 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.438768 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.438814 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.438825 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.438846 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.438862 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.455643 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.478641 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.494133 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.513409 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.533672 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.541799 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.541830 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.541840 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.541856 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.541869 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.549744 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.564756 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.576636 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.586096 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.598035 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.609715 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.621415 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.632947 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.644370 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.644420 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.644433 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.644453 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.644465 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.644592 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.656529 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.670932 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.684721 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.707513 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996
c88ff85ef9ff4fa3eb05c92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.727168 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.742796 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.747297 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.747377 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.747405 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.747433 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.747453 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.762874 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.782721 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.801335 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d74
62\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:15Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.850407 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.850461 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.850471 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.850489 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 
22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.850500 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.953437 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.953499 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.953512 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.953532 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:15 crc kubenswrapper[4846]: I1122 09:14:15.953542 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:15Z","lastTransitionTime":"2025-11-22T09:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.035028 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.035116 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:16 crc kubenswrapper[4846]: E1122 09:14:16.035230 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.035307 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:16 crc kubenswrapper[4846]: E1122 09:14:16.035528 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:16 crc kubenswrapper[4846]: E1122 09:14:16.035721 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.051476 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.056591 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.056647 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.056664 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.056687 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.056701 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.072695 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.089101 4846 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.120438 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.132935 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.146427 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.164503 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.164559 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.164574 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.164598 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.164613 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.165119 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.180739 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.192301 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.205503 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.218668 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.232987 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.246975 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.267243 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.267296 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.267307 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.267329 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.267341 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.269301 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\
":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:16Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 
09:14:16.361314 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.370377 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.370419 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.370429 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.370448 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.370459 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.473405 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.473440 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.473449 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.473464 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.473474 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.576131 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.576188 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.576230 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.576253 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.576266 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.678710 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.678772 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.678787 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.678812 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.678834 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.781799 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.781851 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.781885 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.781904 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.781913 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.885257 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.885303 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.885316 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.885333 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.885345 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.987974 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.988010 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.988019 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.988037 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:16 crc kubenswrapper[4846]: I1122 09:14:16.988083 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:16Z","lastTransitionTime":"2025-11-22T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.091951 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.092023 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.092057 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.092080 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.092104 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:17Z","lastTransitionTime":"2025-11-22T09:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.194403 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.194443 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.194453 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.194469 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.194478 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:17Z","lastTransitionTime":"2025-11-22T09:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.297813 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.297872 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.297890 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.297914 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.297930 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:17Z","lastTransitionTime":"2025-11-22T09:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.365842 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.401772 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.401846 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.401871 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.401901 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.401923 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:17Z","lastTransitionTime":"2025-11-22T09:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.504306 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.504348 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.504357 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.504373 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.504385 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:17Z","lastTransitionTime":"2025-11-22T09:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.606808 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.606851 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.606865 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.606884 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.606906 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:17Z","lastTransitionTime":"2025-11-22T09:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.710006 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.710078 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.710090 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.710110 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.710120 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:17Z","lastTransitionTime":"2025-11-22T09:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.812564 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.812607 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.812620 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.812638 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.812651 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:17Z","lastTransitionTime":"2025-11-22T09:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.914596 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.914633 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.914642 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.914662 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:17 crc kubenswrapper[4846]: I1122 09:14:17.914681 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:17Z","lastTransitionTime":"2025-11-22T09:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.017354 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.017407 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.017420 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.017441 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.017454 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.034696 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.034752 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.034696 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:18 crc kubenswrapper[4846]: E1122 09:14:18.034849 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:18 crc kubenswrapper[4846]: E1122 09:14:18.034913 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:18 crc kubenswrapper[4846]: E1122 09:14:18.034989 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.120362 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.120409 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.120421 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.120438 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.120448 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.223307 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.223361 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.223372 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.223388 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.223400 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.325791 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.325831 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.325839 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.325854 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.325867 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.428524 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.428568 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.428580 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.428599 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.428613 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.440558 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc"] Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.441408 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.443212 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.444971 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.458718 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5
a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] 
Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.473389 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.485742 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.501653 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.513111 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11b0a86f-726c-4264-86f0-3691daeebe8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.513145 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11b0a86f-726c-4264-86f0-3691daeebe8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.513196 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11b0a86f-726c-4264-86f0-3691daeebe8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.513249 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5x2z\" (UniqueName: \"kubernetes.io/projected/11b0a86f-726c-4264-86f0-3691daeebe8b-kube-api-access-g5x2z\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.515699 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.531731 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.531784 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.531794 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.531818 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.531832 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.541131 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.558433 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.572653 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.591918 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.609598 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.614114 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11b0a86f-726c-4264-86f0-3691daeebe8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.614181 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11b0a86f-726c-4264-86f0-3691daeebe8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.614236 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5x2z\" (UniqueName: \"kubernetes.io/projected/11b0a86f-726c-4264-86f0-3691daeebe8b-kube-api-access-g5x2z\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.614262 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11b0a86f-726c-4264-86f0-3691daeebe8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.615066 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11b0a86f-726c-4264-86f0-3691daeebe8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.615117 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11b0a86f-726c-4264-86f0-3691daeebe8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.620955 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11b0a86f-726c-4264-86f0-3691daeebe8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.626645 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.634305 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.634365 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.634384 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.634408 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.634424 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.635860 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5x2z\" (UniqueName: \"kubernetes.io/projected/11b0a86f-726c-4264-86f0-3691daeebe8b-kube-api-access-g5x2z\") pod \"ovnkube-control-plane-749d76644c-twrxc\" (UID: \"11b0a86f-726c-4264-86f0-3691daeebe8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.643988 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.657595 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.677858 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.689627 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:18Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.737503 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc 
kubenswrapper[4846]: I1122 09:14:18.737620 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.737631 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.737653 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.737664 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.755240 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.841013 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.841395 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.841409 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.841429 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.841439 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.943812 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.943870 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.943888 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.943910 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:18 crc kubenswrapper[4846]: I1122 09:14:18.943922 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:18Z","lastTransitionTime":"2025-11-22T09:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.046828 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.046880 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.046895 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.046916 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.046927 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.149790 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.149833 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.149843 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.149882 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.149891 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.253123 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.253191 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.253211 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.253238 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.253259 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.355517 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.355562 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.355573 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.355590 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.355602 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.373722 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" event={"ID":"11b0a86f-726c-4264-86f0-3691daeebe8b","Type":"ContainerStarted","Data":"a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.373794 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" event={"ID":"11b0a86f-726c-4264-86f0-3691daeebe8b","Type":"ContainerStarted","Data":"05597abde73015be6aab5b22d13400a956affe0922076d6257a3bfbc51d4d11e"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.376919 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/0.log" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.379792 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b" exitCode=1 Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.379839 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.380646 4846 scope.go:117] "RemoveContainer" containerID="dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.398378 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.413597 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.429198 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.446542 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.458728 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.458808 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.458824 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.458843 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.458861 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.460462 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.472837 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.483440 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.491839 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.505535 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.523794 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.537345 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.551174 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.561857 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.561902 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.561912 4846 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.561928 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.561940 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.565790 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.583989 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996
c88ff85ef9ff4fa3eb05c92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"or removal\\\\nI1122 09:14:18.320601 6140 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 09:14:18.320614 6140 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 09:14:18.320619 6140 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 09:14:18.320637 6140 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 09:14:18.320643 6140 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 09:14:18.320641 6140 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 09:14:18.320708 6140 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 09:14:18.320642 6140 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 09:14:18.320655 6140 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 09:14:18.320659 6140 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 09:14:18.320719 6140 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 09:14:18.320714 6140 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 09:14:18.320732 6140 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 09:14:18.320737 6140 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 09:14:18.320758 6140 factory.go:656] Stopping watch factory\\\\nI1122 09:14:18.320776 6140 ovnkube.go:599] Stopped ovnkube\\\\nI1122 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.596947 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.667123 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.667167 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.667176 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.667193 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.667203 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.770520 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.770559 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.770567 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.770583 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.770593 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.873995 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.874526 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.874638 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.874751 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.874888 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.915534 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-79xpm"] Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.916489 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:19 crc kubenswrapper[4846]: E1122 09:14:19.916645 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.930673 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.943260 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.958563 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.976094 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.978389 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.978450 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.978467 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.978488 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.978499 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:19Z","lastTransitionTime":"2025-11-22T09:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:19 crc kubenswrapper[4846]: I1122 09:14:19.992795 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:19Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.006640 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.022330 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.029375 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnps2\" (UniqueName: \"kubernetes.io/projected/e79bf3c4-87ae-4009-9a11-d26130912fef-kube-api-access-qnps2\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.029453 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.035063 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.035090 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.035089 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:20 crc kubenswrapper[4846]: E1122 09:14:20.035210 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:20 crc kubenswrapper[4846]: E1122 09:14:20.035561 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:20 crc kubenswrapper[4846]: E1122 09:14:20.035446 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.039435 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.054083 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.067472 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.078610 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.080675 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.080697 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.080706 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.080721 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.080732 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:20Z","lastTransitionTime":"2025-11-22T09:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.091905 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.108318 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.125082 4846 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.130450 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnps2\" (UniqueName: \"kubernetes.io/projected/e79bf3c4-87ae-4009-9a11-d26130912fef-kube-api-access-qnps2\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.130524 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:20 crc kubenswrapper[4846]: E1122 09:14:20.130671 4846 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:20 crc kubenswrapper[4846]: E1122 09:14:20.130733 4846 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs podName:e79bf3c4-87ae-4009-9a11-d26130912fef nodeName:}" failed. No retries permitted until 2025-11-22 09:14:20.630715948 +0000 UTC m=+35.566405597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs") pod "network-metrics-daemon-79xpm" (UID: "e79bf3c4-87ae-4009-9a11-d26130912fef") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.149244 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996
c88ff85ef9ff4fa3eb05c92b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"or removal\\\\nI1122 09:14:18.320601 6140 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 09:14:18.320614 6140 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 09:14:18.320619 6140 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 09:14:18.320637 6140 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 09:14:18.320643 6140 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 09:14:18.320641 6140 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 09:14:18.320708 6140 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 09:14:18.320642 6140 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 09:14:18.320655 6140 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 09:14:18.320659 6140 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 09:14:18.320719 6140 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 09:14:18.320714 6140 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 09:14:18.320732 6140 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 09:14:18.320737 6140 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 09:14:18.320758 6140 factory.go:656] Stopping watch factory\\\\nI1122 09:14:18.320776 6140 ovnkube.go:599] Stopped ovnkube\\\\nI1122 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.152948 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnps2\" (UniqueName: \"kubernetes.io/projected/e79bf3c4-87ae-4009-9a11-d26130912fef-kube-api-access-qnps2\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.166206 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.184037 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.184106 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.184119 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.184138 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.184150 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:20Z","lastTransitionTime":"2025-11-22T09:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.287714 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.287799 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.287815 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.287838 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.287852 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:20Z","lastTransitionTime":"2025-11-22T09:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.386710 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/0.log" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.389556 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.389611 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.389624 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.389643 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.389653 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:20Z","lastTransitionTime":"2025-11-22T09:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.391810 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.391966 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.394499 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" event={"ID":"11b0a86f-726c-4264-86f0-3691daeebe8b","Type":"ContainerStarted","Data":"956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.412900 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.429800 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.461327 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f
28317cd825ddc86250c36065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"or removal\\\\nI1122 09:14:18.320601 6140 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 09:14:18.320614 6140 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 09:14:18.320619 6140 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 09:14:18.320637 6140 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 09:14:18.320643 6140 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 09:14:18.320641 6140 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 09:14:18.320708 6140 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 09:14:18.320642 6140 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 09:14:18.320655 6140 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 09:14:18.320659 6140 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 09:14:18.320719 6140 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 09:14:18.320714 6140 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 09:14:18.320732 6140 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 09:14:18.320737 6140 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 09:14:18.320758 6140 factory.go:656] Stopping watch factory\\\\nI1122 09:14:18.320776 6140 ovnkube.go:599] Stopped ovnkube\\\\nI1122 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.481991 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.492991 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.493061 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.493074 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.493096 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.493109 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:20Z","lastTransitionTime":"2025-11-22T09:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.539482 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.558184 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.592777 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.596788 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.596822 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.596833 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.596850 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.596861 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:20Z","lastTransitionTime":"2025-11-22T09:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.622826 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:
14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.636170 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:20 crc kubenswrapper[4846]: E1122 09:14:20.636332 4846 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:20 crc kubenswrapper[4846]: E1122 09:14:20.636401 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs podName:e79bf3c4-87ae-4009-9a11-d26130912fef nodeName:}" failed. No retries permitted until 2025-11-22 09:14:21.636382821 +0000 UTC m=+36.572072470 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs") pod "network-metrics-daemon-79xpm" (UID: "e79bf3c4-87ae-4009-9a11-d26130912fef") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.653468 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.668665 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.680452 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.691871 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.699766 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.699977 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.700055 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.700128 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.700190 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:20Z","lastTransitionTime":"2025-11-22T09:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.707387 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.723329 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.735686 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.748154 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.761361 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.774217 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.798578 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f
28317cd825ddc86250c36065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"or removal\\\\nI1122 09:14:18.320601 6140 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 09:14:18.320614 6140 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 09:14:18.320619 6140 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 09:14:18.320637 6140 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 09:14:18.320643 6140 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 09:14:18.320641 6140 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 09:14:18.320708 6140 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 09:14:18.320642 6140 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 09:14:18.320655 6140 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 09:14:18.320659 6140 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 09:14:18.320719 6140 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 09:14:18.320714 6140 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 09:14:18.320732 6140 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 09:14:18.320737 6140 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 09:14:18.320758 6140 factory.go:656] Stopping watch factory\\\\nI1122 09:14:18.320776 6140 ovnkube.go:599] Stopped ovnkube\\\\nI1122 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.802452 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.802495 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.802506 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.802521 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.802532 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:20Z","lastTransitionTime":"2025-11-22T09:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.811010 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.825438 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.839261 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.852696 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.870111 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.884954 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.898465 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.905367 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.905566 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.905630 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.905692 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.905749 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:20Z","lastTransitionTime":"2025-11-22T09:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.912576 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.926757 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.943283 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.957645 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.971243 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:20 crc kubenswrapper[4846]: I1122 09:14:20.983419 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:20Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.009129 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.009177 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.009186 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.009203 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.009214 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.111874 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.111915 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.111927 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.111950 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.111965 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.215626 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.215679 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.215693 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.215712 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.215728 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.319346 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.319429 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.319447 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.319474 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.319492 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.400283 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/1.log" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.401395 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/0.log" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.405162 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065" exitCode=1 Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.405256 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.405323 4846 scope.go:117] "RemoveContainer" containerID="dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.406653 4846 scope.go:117] "RemoveContainer" containerID="8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065" Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.406900 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.422472 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.422518 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.422531 4846 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.422551 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.422566 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.426885 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d
0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Runnin
g\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.445615 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.462631 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.479039 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.492723 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.507218 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.520951 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.524633 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.524666 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.524675 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.524690 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.524701 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.533230 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.543890 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.556355 4846 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a
599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.567817 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.577926 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.589904 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.601258 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.621608 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f
28317cd825ddc86250c36065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"or removal\\\\nI1122 09:14:18.320601 6140 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 09:14:18.320614 6140 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 09:14:18.320619 6140 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 09:14:18.320637 6140 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 09:14:18.320643 6140 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 09:14:18.320641 6140 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 09:14:18.320708 6140 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 09:14:18.320642 6140 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 09:14:18.320655 6140 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 09:14:18.320659 6140 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 09:14:18.320719 6140 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 09:14:18.320714 6140 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 09:14:18.320732 6140 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 09:14:18.320737 6140 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 09:14:18.320758 6140 factory.go:656] Stopping watch factory\\\\nI1122 09:14:18.320776 6140 ovnkube.go:599] Stopped ovnkube\\\\nI1122 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:21Z\\\",\\\"message\\\":\\\"string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:21.012480 6354 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 09:14:21.012870 6354 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 09:14:21.012899 6354 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical 
port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF1122 09:14:21.012909 6354 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.627397 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.627436 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.627449 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.627468 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.627480 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.637834 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:21Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.647369 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.647536 4846 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.647601 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs podName:e79bf3c4-87ae-4009-9a11-d26130912fef nodeName:}" failed. No retries permitted until 2025-11-22 09:14:23.647579692 +0000 UTC m=+38.583269351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs") pod "network-metrics-daemon-79xpm" (UID: "e79bf3c4-87ae-4009-9a11-d26130912fef") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.730732 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.730787 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.730800 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.730819 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.730843 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.748579 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.748728 4846 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.748796 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:37.748779399 +0000 UTC m=+52.684469048 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.834281 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.834337 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.834348 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.834367 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.834381 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.850110 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850271 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:14:37.850241953 +0000 UTC m=+52.785931602 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.850331 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.850377 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.850457 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850522 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850549 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850609 4846 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850680 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:37.850663014 +0000 UTC m=+52.786352663 (durationBeforeRetry 16s). 
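The TearDown failure above is a registration race rather than a data problem: after a kubelet restart, a CSI volume cannot be unmounted until its driver has re-registered its endpoint over the kubelet plugin socket, so the lookup of "kubevirt.io.hostpath-provisioner" in the registered-driver list fails. Below is a rough sketch of the lookup shape only, using an assumed in-memory map in place of kubelet's actual CSI plugin registry (csi_plugin.go); the type and method names are invented for illustration.

    package main

    import (
        "fmt"
        "sync"
    )

    // driverRegistry mimics the shape of kubelet's registered-CSI-driver
    // list: unmount cannot proceed until the named driver has
    // re-registered its endpoint after a restart.
    type driverRegistry struct {
        mu        sync.RWMutex
        endpoints map[string]string // driver name -> unix socket path
    }

    func (r *driverRegistry) lookup(name string) (string, error) {
        r.mu.RLock()
        defer r.mu.RUnlock()
        ep, ok := r.endpoints[name]
        if !ok {
            // Same wording as the Unmounter.TearDownAt error above.
            return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
        }
        return ep, nil
    }

    func main() {
        reg := &driverRegistry{endpoints: map[string]string{}} // empty right after restart
        if _, err := reg.lookup("kubevirt.io.hostpath-provisioner"); err != nil {
            fmt.Println("TearDownAt would fail:", err)
        }
    }

Once the driver pod comes back and re-registers, the same lookup succeeds and the queued unmount retries (16s later, per the backoff above) go through.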
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850680 4846 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850695 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850784 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:37.850760217 +0000 UTC m=+52.786449946 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850801 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850817 4846 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:21 crc kubenswrapper[4846]: E1122 09:14:21.850858 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 09:14:37.85084876 +0000 UTC m=+52.786538409 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.937928 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.937997 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.938015 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.938077 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:21 crc kubenswrapper[4846]: I1122 09:14:21.938101 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:21Z","lastTransitionTime":"2025-11-22T09:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.034741 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.034985 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.035401 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.036193 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.036266 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.036354 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.036471 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.036951 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.037185 4846 scope.go:117] "RemoveContainer" containerID="8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.041638 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.041660 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.041670 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.041688 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.041699 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.144025 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.144085 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.144095 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.144114 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.144129 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.246852 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.246903 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.246915 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.246934 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.246945 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.350809 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.350854 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.350863 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.350880 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.350891 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.411385 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/1.log" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.416819 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.418454 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.418918 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.438363 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.448723 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.448768 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.448780 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.448801 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.448813 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.459150 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.470244 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.476415 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.476483 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.476494 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.476511 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.476523 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.477589 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.488246 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.492837 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.492866 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.492878 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.492899 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.492909 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.498149 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d
004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.507234 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.512504 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.512541 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.512551 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.512573 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.512584 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.513389 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.527966 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.536943 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.541149 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.541204 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.541185 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.541219 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.541466 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.541531 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.552615 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.558557 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: E1122 09:14:22.558706 4846 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.560455 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.560513 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.560525 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.560542 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.560556 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.566651 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.581329 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.593182 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.603334 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.617178 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.629737 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.649270 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f
28317cd825ddc86250c36065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"or removal\\\\nI1122 09:14:18.320601 6140 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 09:14:18.320614 6140 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 09:14:18.320619 6140 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 09:14:18.320637 6140 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 09:14:18.320643 6140 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 09:14:18.320641 6140 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 09:14:18.320708 6140 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 09:14:18.320642 6140 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 09:14:18.320655 6140 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 09:14:18.320659 6140 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 09:14:18.320719 6140 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 09:14:18.320714 6140 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 09:14:18.320732 6140 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 09:14:18.320737 6140 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 09:14:18.320758 6140 factory.go:656] Stopping watch factory\\\\nI1122 09:14:18.320776 6140 ovnkube.go:599] Stopped ovnkube\\\\nI1122 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:21Z\\\",\\\"message\\\":\\\"string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:21.012480 6354 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 09:14:21.012870 6354 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 09:14:21.012899 6354 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical 
port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF1122 09:14:21.012909 6354 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.663180 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:22Z is after 2025-08-24T17:21:41Z" Nov 22 
09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.663743 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.663795 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.663809 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.663831 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.663841 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.772329 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.772393 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.772415 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.772441 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.772456 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.875531 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.875607 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.875632 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.875668 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.875695 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.978219 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.978263 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.978274 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.978293 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:22 crc kubenswrapper[4846]: I1122 09:14:22.978307 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:22Z","lastTransitionTime":"2025-11-22T09:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.080543 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.080594 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.080605 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.080624 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.080637 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:23Z","lastTransitionTime":"2025-11-22T09:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.184604 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.184667 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.184681 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.184702 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.184715 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:23Z","lastTransitionTime":"2025-11-22T09:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.287944 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.288001 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.288012 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.288034 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.288072 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:23Z","lastTransitionTime":"2025-11-22T09:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.391408 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.391454 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.391467 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.391486 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.391497 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:23Z","lastTransitionTime":"2025-11-22T09:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.494860 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.494914 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.494928 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.494945 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.494959 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:23Z","lastTransitionTime":"2025-11-22T09:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.598014 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.598098 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.598113 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.598134 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.598147 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:23Z","lastTransitionTime":"2025-11-22T09:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.672719 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:23 crc kubenswrapper[4846]: E1122 09:14:23.673035 4846 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:23 crc kubenswrapper[4846]: E1122 09:14:23.673201 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs podName:e79bf3c4-87ae-4009-9a11-d26130912fef nodeName:}" failed. No retries permitted until 2025-11-22 09:14:27.673171946 +0000 UTC m=+42.608861635 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs") pod "network-metrics-daemon-79xpm" (UID: "e79bf3c4-87ae-4009-9a11-d26130912fef") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.701691 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.701735 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.701747 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.701766 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.701777 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:23Z","lastTransitionTime":"2025-11-22T09:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.804702 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.804757 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.804773 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.804809 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.804829 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:23Z","lastTransitionTime":"2025-11-22T09:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.907891 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.907963 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.907978 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.907998 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:23 crc kubenswrapper[4846]: I1122 09:14:23.908012 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:23Z","lastTransitionTime":"2025-11-22T09:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.011379 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.011476 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.011499 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.011531 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.011553 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.034790 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.034890 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.034913 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.034800 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:24 crc kubenswrapper[4846]: E1122 09:14:24.035367 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:24 crc kubenswrapper[4846]: E1122 09:14:24.035496 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:24 crc kubenswrapper[4846]: E1122 09:14:24.035595 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:24 crc kubenswrapper[4846]: E1122 09:14:24.035640 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.114843 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.114906 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.114918 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.114939 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.114950 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.217791 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.217836 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.217848 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.217865 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.217878 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.320583 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.320647 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.320670 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.320697 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.320715 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.423168 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.423208 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.423218 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.423240 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.423269 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.526251 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.526307 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.526321 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.526343 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.526357 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.629239 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.629294 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.629305 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.629330 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.629342 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.731845 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.731884 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.731896 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.731913 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.731924 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.834638 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.834684 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.834696 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.834715 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.834726 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.938905 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.938954 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.938967 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.938993 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:24 crc kubenswrapper[4846]: I1122 09:14:24.939006 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:24Z","lastTransitionTime":"2025-11-22T09:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.042683 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.042741 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.042751 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.042769 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.042779 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.145835 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.145881 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.145894 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.145911 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.145923 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.252197 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.252294 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.252321 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.252356 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.252388 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.355462 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.355516 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.355524 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.355542 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.355552 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.458917 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.458970 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.459009 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.459032 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.459096 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.562287 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.562337 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.562356 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.562379 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.562395 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.665891 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.665962 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.665993 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.666027 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.666083 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.769428 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.769514 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.769525 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.769548 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.769559 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.872815 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.872859 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.872870 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.872888 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.872899 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.976347 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.976428 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.976451 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.976482 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:25 crc kubenswrapper[4846]: I1122 09:14:25.976503 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:25Z","lastTransitionTime":"2025-11-22T09:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.034282 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.034341 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.034347 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.034281 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:26 crc kubenswrapper[4846]: E1122 09:14:26.034486 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:26 crc kubenswrapper[4846]: E1122 09:14:26.034583 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:26 crc kubenswrapper[4846]: E1122 09:14:26.034717 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:26 crc kubenswrapper[4846]: E1122 09:14:26.034865 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.055464 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.072851 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.079428 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.079475 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.079492 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.079534 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.079570 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:26Z","lastTransitionTime":"2025-11-22T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.088212 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.105158 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.126568 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.144145 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.168139 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f
28317cd825ddc86250c36065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9aac8f653441cb6fbe3c36c040f06938c1996c88ff85ef9ff4fa3eb05c92b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"or removal\\\\nI1122 09:14:18.320601 6140 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 09:14:18.320614 6140 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 09:14:18.320619 6140 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 09:14:18.320637 6140 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 09:14:18.320643 6140 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 09:14:18.320641 6140 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 09:14:18.320708 6140 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 09:14:18.320642 6140 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 09:14:18.320655 6140 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 09:14:18.320659 6140 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 09:14:18.320719 6140 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 09:14:18.320714 6140 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 09:14:18.320732 6140 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 09:14:18.320737 6140 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 09:14:18.320758 6140 factory.go:656] Stopping watch factory\\\\nI1122 09:14:18.320776 6140 ovnkube.go:599] Stopped ovnkube\\\\nI1122 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:21Z\\\",\\\"message\\\":\\\"string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:21.012480 6354 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 09:14:21.012870 6354 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 09:14:21.012899 6354 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical 
port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF1122 09:14:21.012909 6354 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.181246 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.181296 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.181309 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.181328 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.181342 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:26Z","lastTransitionTime":"2025-11-22T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.189323 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.207035 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.222307 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.236251 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.253233 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.269869 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.284098 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.284160 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.284174 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.284194 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.284206 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:26Z","lastTransitionTime":"2025-11-22T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.288348 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.300389 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.311248 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:26Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.386849 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.387142 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.387214 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.387304 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.387365 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:26Z","lastTransitionTime":"2025-11-22T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.489908 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.489956 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.489968 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.489988 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.490006 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:26Z","lastTransitionTime":"2025-11-22T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.593361 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.593428 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.593447 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.593472 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.593493 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:26Z","lastTransitionTime":"2025-11-22T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.697075 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.697111 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.697120 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.697163 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.697178 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:26Z","lastTransitionTime":"2025-11-22T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.800430 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.800504 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.800519 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.800538 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.800553 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:26Z","lastTransitionTime":"2025-11-22T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.903960 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.904000 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.904010 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.904026 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:26 crc kubenswrapper[4846]: I1122 09:14:26.904037 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:26Z","lastTransitionTime":"2025-11-22T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.007605 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.007644 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.007654 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.007673 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.007683 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.111098 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.111151 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.111162 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.111183 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.111194 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.214528 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.214596 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.214610 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.214633 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.214647 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.317321 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.317361 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.317371 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.317388 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.317398 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.420598 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.420696 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.420726 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.420760 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.420797 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.522660 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.522704 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.522714 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.522732 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.522745 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.625594 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.625650 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.625663 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.625691 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.625709 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.721326 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm"
Nov 22 09:14:27 crc kubenswrapper[4846]: E1122 09:14:27.721579 4846 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 22 09:14:27 crc kubenswrapper[4846]: E1122 09:14:27.721687 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs podName:e79bf3c4-87ae-4009-9a11-d26130912fef nodeName:}" failed. No retries permitted until 2025-11-22 09:14:35.721662766 +0000 UTC m=+50.657352435 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs") pod "network-metrics-daemon-79xpm" (UID: "e79bf3c4-87ae-4009-9a11-d26130912fef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.728571 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.728611 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.728620 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.728637 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.728651 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.831552 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.831634 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.831651 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.831684 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.831702 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
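The nestedpendingoperations.go entry above shows why the failed secret mount is not retried immediately: volume operations back off exponentially, and here the kubelet refuses retries for 8s (durationBeforeRetry 8s). A minimal Go sketch of such a doubling schedule follows; it is illustrative only, delayForAttempt is a hypothetical helper, and the 500ms starting delay and 2m cap are assumed defaults rather than values taken from this log.

package main

import (
	"fmt"
	"time"
)

// delayForAttempt returns the wait before retrying after the n-th consecutive
// failure (0-based), doubling each time up to an assumed cap.
func delayForAttempt(n int) time.Duration {
	const (
		initialDelay = 500 * time.Millisecond // assumption, not from this log
		maxDelay     = 2 * time.Minute        // assumption, not from this log
	)
	d := initialDelay << uint(n) // 0.5s, 1s, 2s, 4s, 8s, ...
	if d <= 0 || d > maxDelay {  // d <= 0 guards against shift overflow
		return maxDelay
	}
	return d
}

func main() {
	for n := 0; n <= 5; n++ {
		fmt.Printf("failure %d -> wait %s before retry\n", n, delayForAttempt(n))
	}
	// Under these assumptions, failure 4 yields 8s, matching the
	// durationBeforeRetry seen in the log entry above.
}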
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.934657 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.934721 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.934736 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.934753 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:27 crc kubenswrapper[4846]: I1122 09:14:27.934765 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:27Z","lastTransitionTime":"2025-11-22T09:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.035234 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.035417 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.035264 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 09:14:28 crc kubenswrapper[4846]: E1122 09:14:28.035665 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.035723 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm"
Nov 22 09:14:28 crc kubenswrapper[4846]: E1122 09:14:28.036188 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 09:14:28 crc kubenswrapper[4846]: E1122 09:14:28.036414 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 09:14:28 crc kubenswrapper[4846]: E1122 09:14:28.036653 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.037387 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.037450 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.037469 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.037502 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.037520 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.140428 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.140533 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.140542 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.140560 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.140571 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.243704 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.243783 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.243801 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.243835 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.243853 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.347757 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.347819 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.347848 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.347873 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.347889 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.450899 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.450958 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.450970 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.450990 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.451006 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.554153 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.554237 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.554265 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.554294 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.554316 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.657732 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.657810 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.657830 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.657856 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.657873 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.760969 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.761023 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.761036 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.761093 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.761108 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.864849 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.864897 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.864907 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.864925 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.864938 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.967076 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.967124 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.967136 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.967152 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:28 crc kubenswrapper[4846]: I1122 09:14:28.967162 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:28Z","lastTransitionTime":"2025-11-22T09:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.069899 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.069965 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.069975 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.069997 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.070011 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.172997 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.173097 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.173119 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.173143 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.173162 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.275999 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.276079 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.276092 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.276110 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.276121 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.379009 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.379121 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.379133 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.379161 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.379176 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.482946 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.483001 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.483013 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.483033 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.483064 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.585701 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.585769 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.585790 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.585821 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.585844 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.688445 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.688492 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.688504 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.688521 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.688532 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.791261 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.791304 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.791313 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.791327 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.791336 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.895494 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.895554 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.895566 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.895589 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.895600 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.998612 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.998669 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.998680 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.998700 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:29 crc kubenswrapper[4846]: I1122 09:14:29.998723 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:29Z","lastTransitionTime":"2025-11-22T09:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.034427 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.034509 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.034493 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 09:14:30 crc kubenswrapper[4846]: E1122 09:14:30.034647 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.034733 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 09:14:30 crc kubenswrapper[4846]: E1122 09:14:30.034815 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef"
Nov 22 09:14:30 crc kubenswrapper[4846]: E1122 09:14:30.034970 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 09:14:30 crc kubenswrapper[4846]: E1122 09:14:30.035343 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.101842 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.101916 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.101927 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.101947 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.101958 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:30Z","lastTransitionTime":"2025-11-22T09:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.205691 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.205754 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.205790 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.205816 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.205833 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:30Z","lastTransitionTime":"2025-11-22T09:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.308552 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.308597 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.308607 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.308625 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.308645 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:30Z","lastTransitionTime":"2025-11-22T09:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.412424 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.412485 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.412499 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.412521 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.412532 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:30Z","lastTransitionTime":"2025-11-22T09:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.515369 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.515435 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.515453 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.515479 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.515497 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:30Z","lastTransitionTime":"2025-11-22T09:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.618635 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.618701 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.618724 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.618749 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.618771 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:30Z","lastTransitionTime":"2025-11-22T09:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.725913 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.726025 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.726058 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.726087 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.726103 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:30Z","lastTransitionTime":"2025-11-22T09:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.829184 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.829222 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.829229 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.829244 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.829254 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:30Z","lastTransitionTime":"2025-11-22T09:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.933067 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.933126 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.933139 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.933160 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:30 crc kubenswrapper[4846]: I1122 09:14:30.933173 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:30Z","lastTransitionTime":"2025-11-22T09:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.036350 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.036408 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.036420 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.036438 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.036450 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.139753 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.139801 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.139813 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.139830 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.139844 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.242844 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.242900 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.242910 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.242930 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.242941 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.345375 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.345419 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.345432 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.345453 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.345467 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.448384 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.448446 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.448456 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.448474 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.448487 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.551654 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.551698 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.551710 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.551726 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.551736 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.654464 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.654512 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.654521 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.654539 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.654550 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.756481 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.756532 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.756545 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.756568 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.756584 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.860487 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.861094 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.861118 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.861146 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.861165 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.964086 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.964133 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.964143 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.964163 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:31 crc kubenswrapper[4846]: I1122 09:14:31.964173 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:31Z","lastTransitionTime":"2025-11-22T09:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.034344 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.034483 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.034380 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.034490 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.034603 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.034739 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.034820 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef"
Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.035117 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.066570 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.066874 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.067006 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.067142 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.067244 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
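
The NetworkPluginNotReady errors above all name the directory the container runtime searches for a network configuration, /etc/kubernetes/cni/net.d/. A minimal Python sketch of that same check; the extensions .conf, .conflist and .json are the conventional CNI config suffixes and are an assumption here, not something the log states:

    from pathlib import Path

    # Directory named in the kubelet errors above.
    CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")

    # Conventional CNI config extensions; assumed, not taken from the log.
    CNI_EXTENSIONS = {".conf", ".conflist", ".json"}

    def cni_configs(directory: Path = CNI_CONF_DIR) -> list[Path]:
        """Return the CNI configuration files found in the directory, if any."""
        if not directory.is_dir():
            return []
        return sorted(p for p in directory.iterdir() if p.suffix in CNI_EXTENSIONS)

    configs = cni_configs()
    if configs:
        for p in configs:
            print(f"found CNI config: {p}")
    else:
        # The state this log shows: nothing for the runtime to load yet.
        print(f"no CNI configuration file in {CNI_CONF_DIR}/")

Until the network provider writes a configuration into that directory, every heartbeat above reports the node NotReady and no pod sandbox can be created.
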
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.169491 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.169972 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.170160 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.170313 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.170431 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.273382 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.273445 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.273455 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.273473 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.273493 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.376609 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.376706 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.376724 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.376750 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.376767 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.478864 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.478933 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.478947 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.478970 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.478984 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.582031 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.582106 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.582116 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.582133 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.582145 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.685126 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.685172 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.685182 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.685199 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.685210 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.702313 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.702363 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.702375 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.702400 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.702415 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.718156 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:32Z is after 2025-08-24T17:21:41Z"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.722242 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.722282 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
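
The patch failure above is not a connectivity problem: kubelet reaches the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 but rejects its serving certificate, which per the error expired at 2025-08-24T17:21:41Z, months before the current time 2025-11-22T09:14:32Z. A minimal Python sketch that fetches that certificate and prints its validity window; verification is disabled on purpose (the certificate is the thing under inspection) and the third-party cryptography package is assumed to be available:

    import socket
    import ssl

    from cryptography import x509  # third-party; assumed available

    # Endpoint taken from the webhook error above.
    HOST, PORT = "127.0.0.1", 9743

    # Verification is off on purpose: per the log the certificate no longer
    # verifies, and we only want to read its validity dates.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)  # log shows 2025-08-24T17:21:41Z

The kubelet keeps retrying the status patch, which is why the same payload and certificate error reappear below.
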
event="NodeHasNoDiskPressure" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.722294 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.722312 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.722324 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.739475 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:32Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.744272 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.744460 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.744592 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.744692 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.744805 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.758969 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:32Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.764306 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.764484 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.764556 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.764631 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.764695 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.782536 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:32Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.786663 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.786697 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.786704 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.786719 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.786732 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.798454 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:32Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:32 crc kubenswrapper[4846]: E1122 09:14:32.798575 4846 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.800659 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
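Every retry of this node-status patch fails identically: the payload (the same image list as in the first attempt) is rejected because the kubelet cannot complete a TLS handshake with the network-node-identity webhook on 127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-11-22. A minimal sketch of how one could confirm the validity window from the node follows (hypothetical helper script, not part of any OpenShift tooling; it assumes Python 3 and the openssl CLI are present on the host):

    # webhook_cert_dates.py -- print the validity window of the
    # network-node-identity webhook serving certificate (sketch).
    import ssl
    import subprocess

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the log

    # Fetch the serving certificate WITHOUT verifying it; a verified
    # handshake would fail here precisely because the cert is expired.
    pem = ssl.get_server_certificate((HOST, PORT))

    # Let the openssl CLI print notBefore/notAfter for comparison with
    # the node clock (per the log: 2025-11-22 vs. 2025-08-24T17:21:41Z).
    result = subprocess.run(
        ["openssl", "x509", "-noout", "-dates"],
        input=pem.encode(), capture_output=True, check=True,
    )
    print(result.stdout.decode(), end="")

Until that certificate is renewed, every webhook-gated write (node status here, pod status further below) keeps failing with the same x509 error.
Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.800659 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 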
event="NodeHasSufficientMemory" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.800792 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.800867 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.800955 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.801097 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.904454 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.904729 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.904795 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.904911 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:32 crc kubenswrapper[4846]: I1122 09:14:32.904971 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:32Z","lastTransitionTime":"2025-11-22T09:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.008307 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.008689 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.008823 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.008935 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.009016 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.112354 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.112402 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.112414 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.112435 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.112449 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.216372 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.216459 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.216483 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.216516 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.216541 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.320299 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.320370 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.320382 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.320406 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.320420 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.423552 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.423609 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.423621 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.423642 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.423654 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.526366 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.526421 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.526437 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.526459 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.526474 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.630010 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.630088 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.630103 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.630124 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.630139 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.733373 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.733425 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.733436 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.733455 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.733467 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.837380 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.837450 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.837468 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.837490 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.837502 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.940923 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.941004 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.941024 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.941098 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:33 crc kubenswrapper[4846]: I1122 09:14:33.941141 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:33Z","lastTransitionTime":"2025-11-22T09:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.034951 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.035120 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:34 crc kubenswrapper[4846]: E1122 09:14:34.035208 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.035103 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.035294 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:34 crc kubenswrapper[4846]: E1122 09:14:34.035329 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:34 crc kubenswrapper[4846]: E1122 09:14:34.035436 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:34 crc kubenswrapper[4846]: E1122 09:14:34.035556 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.043802 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.043879 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.043902 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.043951 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.043985 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.146927 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.146992 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.147009 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.147036 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.147089 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.249583 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.249619 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.249627 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.249643 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.249653 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.353260 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.353323 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.353335 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.353359 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.353374 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.455886 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.455939 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.455950 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.455971 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.455983 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.558781 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.558826 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.558841 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.558861 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.558874 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.662187 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.662243 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.662256 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.662291 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.662305 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.765746 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.765775 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.765784 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.765799 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.765809 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.869119 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.869191 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.869202 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.869240 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.869254 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.955355 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.956214 4846 scope.go:117] "RemoveContainer" containerID="8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.972141 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.972177 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.972187 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.972205 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.972215 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:34Z","lastTransitionTime":"2025-11-22T09:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.973146 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:34Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:34 crc kubenswrapper[4846]: I1122 09:14:34.989120 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:34Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.008127 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.023125 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.036397 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.050455 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.063598 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.075763 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.076228 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.076639 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.076724 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.076869 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:35Z","lastTransitionTime":"2025-11-22T09:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.082463 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.096685 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.110919 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.122248 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.140100 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.163579 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f
28317cd825ddc86250c36065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:21Z\\\",\\\"message\\\":\\\"string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:21.012480 6354 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 09:14:21.012870 6354 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 09:14:21.012899 6354 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF1122 09:14:21.012909 6354 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.177150 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.179793 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.179844 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.179855 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.179873 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.179905 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:35Z","lastTransitionTime":"2025-11-22T09:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.221178 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.249242 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.283083 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.283110 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.283119 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.283134 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.283144 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:35Z","lastTransitionTime":"2025-11-22T09:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.386031 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.386094 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.386104 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.386121 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.386131 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:35Z","lastTransitionTime":"2025-11-22T09:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.465923 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/1.log" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.468854 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098"} Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.469256 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.483998 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.489065 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.489111 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.489124 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.489145 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.489158 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:35Z","lastTransitionTime":"2025-11-22T09:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
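[Annotation] The repeating NodeNotReady condition is a separate failure from the webhook errors: the runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ holds no CNI configuration while ovnkube-controller crash-loops (its restart is visible in the PLEG events above). Conceptually the readiness test is just a scan of that directory for a network config; a hedged illustration of that check — not CRI-O's actual implementation:

```go
// cnicheck.go - conceptual sketch of the test behind "no CNI configuration
// file in /etc/kubernetes/cni/net.d/"; illustrative, not runtime code.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// networkReady reports whether the CNI conf directory contains at least one
// network definition, the condition the runtime needs to mark NetworkReady.
func networkReady(confDir string) (bool, error) {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ready, err := networkReady("/etc/kubernetes/cni/net.d")
	if err != nil {
		fmt.Println("NetworkReady=false:", err)
		return
	}
	if !ready {
		fmt.Println("NetworkReady=false: no CNI configuration file found")
		return
	}
	fmt.Println("NetworkReady=true")
}
```

Once ovnkube-controller stays up long enough to write its CNI config into that directory, the Ready condition flips without any kubelet restart. The log resumes below.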
Has your network provider started?"} Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.506944 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:21Z\\\",\\\"message\\\":\\\"string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:21.012480 6354 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 09:14:21.012870 6354 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 09:14:21.012899 6354 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF1122 09:14:21.012909 6354 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.521806 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 
09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.536373 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.550149 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.563948 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.580646 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.591005 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.591032 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.591065 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.591082 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.591109 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:35Z","lastTransitionTime":"2025-11-22T09:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.598816 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.612940 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.628852 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.640448 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.651933 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.667729 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.682559 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.696769 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.700570 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.700606 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.700615 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.700630 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.700642 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:35Z","lastTransitionTime":"2025-11-22T09:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.712578 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:35Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.744361 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:35 crc kubenswrapper[4846]: E1122 09:14:35.745083 4846 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:35 crc kubenswrapper[4846]: E1122 09:14:35.745297 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs podName:e79bf3c4-87ae-4009-9a11-d26130912fef nodeName:}" failed. No retries permitted until 2025-11-22 09:14:51.745268884 +0000 UTC m=+66.680958533 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs") pod "network-metrics-daemon-79xpm" (UID: "e79bf3c4-87ae-4009-9a11-d26130912fef") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.803860 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.803915 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.803928 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.803963 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.803976 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:35Z","lastTransitionTime":"2025-11-22T09:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.907011 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.907068 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.907078 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.907093 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:35 crc kubenswrapper[4846]: I1122 09:14:35.907106 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:35Z","lastTransitionTime":"2025-11-22T09:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.010319 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.010376 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.010393 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.010416 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.010430 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.034962 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:36 crc kubenswrapper[4846]: E1122 09:14:36.035153 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.035278 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:36 crc kubenswrapper[4846]: E1122 09:14:36.035353 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.035500 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.035519 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:36 crc kubenswrapper[4846]: E1122 09:14:36.035806 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:36 crc kubenswrapper[4846]: E1122 09:14:36.035909 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.052942 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.066941 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.085907 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.108249 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.113208 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.113251 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.113269 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.113287 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.113298 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.123130 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.136572 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.150321 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.162938 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.179441 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.191605 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.201968 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.213420 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.215796 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.215839 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.215851 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.215870 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.215880 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.225614 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.238234 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.256597 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746
c1583b7a01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:21Z\\\",\\\"message\\\":\\\"string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:21.012480 6354 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 09:14:21.012870 6354 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 09:14:21.012899 6354 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF1122 09:14:21.012909 6354 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.272626 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 
09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.318818 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.318868 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.318881 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.318900 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.318913 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.423573 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.423625 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.423638 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.423657 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.423668 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.474730 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/2.log" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.476127 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/1.log" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.479201 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098" exitCode=1 Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.479268 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.479327 4846 scope.go:117] "RemoveContainer" containerID="8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.480599 4846 scope.go:117] "RemoveContainer" containerID="81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098" Nov 22 09:14:36 crc kubenswrapper[4846]: E1122 09:14:36.480774 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.495547 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.508623 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.527153 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.527603 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.527616 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.527637 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.527651 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.532504 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c78f640e92aeddbe30400d08f4f3a3568d0c26f28317cd825ddc86250c36065\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:21Z\\\",\\\"message\\\":\\\"string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:21.012480 6354 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1122 09:14:21.012870 6354 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1122 09:14:21.012899 6354 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF1122 09:14:21.012909 6354 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:35Z\\\",\\\"message\\\":\\\".LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.233\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:35.770980 6557 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1122 09:14:35.770983 6557 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
webhook\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.544971 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.561580 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.576472 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.590200 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.605007 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.618181 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.631174 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.631218 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.631232 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.631253 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.631272 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.630832 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.645579 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.656834 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.668619 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.679994 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.690884 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.699985 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:36Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.734316 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.734357 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.734369 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.734397 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.734408 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.837176 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.837229 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.837243 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.837269 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.837283 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.939556 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.939617 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.939639 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.939665 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:36 crc kubenswrapper[4846]: I1122 09:14:36.939686 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:36Z","lastTransitionTime":"2025-11-22T09:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.042595 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.042638 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.042649 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.042666 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.042679 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.146195 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.146249 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.146263 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.146288 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.146301 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.249282 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.249362 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.249436 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.249473 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.249499 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.352970 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.353083 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.353112 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.353143 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.353164 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.456161 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.456239 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.456257 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.456288 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.456328 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.486637 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/2.log" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.491766 4846 scope.go:117] "RemoveContainer" containerID="81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098" Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.492150 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.509643 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.522601 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.534709 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.551682 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.561474 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.561540 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.561552 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.561572 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.561585 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.569604 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.581452 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.592474 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.603881 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.616022 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.628142 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.637713 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.648580 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.661494 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.664217 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.664261 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.664273 4846 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.664291 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.664303 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.673448 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.689585 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a
01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:35Z\\\",\\\"message\\\":\\\".LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.233\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:35.770980 6557 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1122 09:14:35.770983 6557 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.699242 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:37Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.767402 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.767436 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.767446 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.767463 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.767479 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.769261 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.769341 4846 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.769399 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:15:09.76938359 +0000 UTC m=+84.705073239 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.869642 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.869697 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.869710 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.869730 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.869734 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.869741 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.869818 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.869838 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.869866 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.869894 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:15:09.869874111 +0000 UTC m=+84.805563760 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.869967 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.869982 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.869991 4846 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.870000 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.870017 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 09:14:37 
crc kubenswrapper[4846]: E1122 09:14:37.870022 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 09:15:09.870014815 +0000 UTC m=+84.805704464 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.870027 4846 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.870084 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 09:15:09.870072796 +0000 UTC m=+84.805762445 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.870086 4846 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:37 crc kubenswrapper[4846]: E1122 09:14:37.870122 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:15:09.870113537 +0000 UTC m=+84.805803186 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.972880 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.972930 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.972970 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.972988 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:37 crc kubenswrapper[4846]: I1122 09:14:37.973001 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:37Z","lastTransitionTime":"2025-11-22T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.035151 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.035202 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:38 crc kubenswrapper[4846]: E1122 09:14:38.035316 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.035362 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:38 crc kubenswrapper[4846]: E1122 09:14:38.035496 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.035521 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:38 crc kubenswrapper[4846]: E1122 09:14:38.035570 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:38 crc kubenswrapper[4846]: E1122 09:14:38.035816 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.075994 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.076075 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.076088 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.076105 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.076181 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:38Z","lastTransitionTime":"2025-11-22T09:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.179404 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.179447 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.179459 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.179477 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.179488 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:38Z","lastTransitionTime":"2025-11-22T09:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.282374 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.282418 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.282431 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.282450 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.282461 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:38Z","lastTransitionTime":"2025-11-22T09:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.386145 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.386183 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.386194 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.386209 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.386224 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:38Z","lastTransitionTime":"2025-11-22T09:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.489665 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.489727 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.489741 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.489761 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.489775 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:38Z","lastTransitionTime":"2025-11-22T09:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.592128 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.592177 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.592188 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.592206 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.592219 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:38Z","lastTransitionTime":"2025-11-22T09:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.694694 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.694760 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.694772 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.694794 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.694807 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:38Z","lastTransitionTime":"2025-11-22T09:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.797617 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.797658 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.797669 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.797687 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.797701 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:38Z","lastTransitionTime":"2025-11-22T09:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.803739 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.818249 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.820082 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.832017 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.845402 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.856746 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.870451 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.883704 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.896854 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.900503 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.900544 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.900553 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.900582 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.900594 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:38Z","lastTransitionTime":"2025-11-22T09:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.908354 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.919108 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.935736 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:35Z\\\",\\\"message\\\":\\\".LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.233\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:35.770980 6557 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1122 09:14:35.770983 6557 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.946731 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.957630 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.970072 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.980870 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:38 crc kubenswrapper[4846]: I1122 09:14:38.993516 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T09:14:38Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.003599 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.003636 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.003646 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.003662 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.003671 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.006416 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:39Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.106127 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.106176 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.106186 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.106204 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.106217 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.208706 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.209025 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.209112 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.209186 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.209341 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.312122 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.312156 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.312169 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.312187 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.312199 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.416468 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.416704 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.416802 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.416868 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.416927 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.520222 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.520290 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.520312 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.520339 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.520358 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.623404 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.623471 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.623493 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.623522 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.623542 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.727063 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.727119 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.727130 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.727149 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.727172 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.830654 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.830765 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.830804 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.830841 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.830870 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.934008 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.934097 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.934109 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.934152 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:39 crc kubenswrapper[4846]: I1122 09:14:39.934164 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:39Z","lastTransitionTime":"2025-11-22T09:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.032077 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.034486 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:40 crc kubenswrapper[4846]: E1122 09:14:40.034625 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.034756 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.034764 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.034757 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:40 crc kubenswrapper[4846]: E1122 09:14:40.034860 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:40 crc kubenswrapper[4846]: E1122 09:14:40.034985 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:40 crc kubenswrapper[4846]: E1122 09:14:40.035171 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.036570 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.036602 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.036615 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.036633 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.036648 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.048010 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.064775 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.083547 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.095895 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.110705 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.128709 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.139061 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.139114 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.139127 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.139144 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.139158 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.142771 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.157421 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.173521 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.189692 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.209177 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a
01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:35Z\\\",\\\"message\\\":\\\".LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.233\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:35.770980 6557 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1122 09:14:35.770983 6557 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.223317 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.242339 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.242375 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.242387 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.242403 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.242417 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.243104 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.255433 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6689f712-a146-4e6c-b428-02663f7d5906\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2421789414be0566b9e054990def55fa91dc75dd3e1244d4a7dca86c0aafc17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40455af6104d8aa70c20ac13f028d0c69e35e75e7c8f4e2fca52ba27ba956a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7342fe923d5b1956317e71de0a3772d27d9597f42fb3a25c1e7ed3dc2359e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.271460 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 
2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.288277 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.306248 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"
mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:40Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.345830 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.345889 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.345898 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.345917 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.345927 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.448783 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.448850 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.448865 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.448889 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.448906 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.552338 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.552378 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.552392 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.552413 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.552430 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.656268 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.656363 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.656388 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.656425 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.656449 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.759497 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.759569 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.759587 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.759615 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.759634 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.862338 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.862385 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.862395 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.862409 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.862419 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.965005 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.965096 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.965116 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.965144 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:40 crc kubenswrapper[4846]: I1122 09:14:40.965252 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:40Z","lastTransitionTime":"2025-11-22T09:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.068118 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.068170 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.068180 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.068197 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.068207 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.171296 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.171340 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.171349 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.171365 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.171377 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.274300 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.274351 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.274361 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.274379 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.274388 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.377904 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.377953 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.377963 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.377982 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.377995 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.481218 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.481255 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.481265 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.481285 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.481296 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.584092 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.584144 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.584156 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.584176 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.584188 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.686770 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.686830 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.686849 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.686873 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.686899 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.789246 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.789293 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.789311 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.789329 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.789341 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.892007 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.892092 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.892102 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.892120 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.892137 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.994498 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.994540 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.994551 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.994570 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:41 crc kubenswrapper[4846]: I1122 09:14:41.994581 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:41Z","lastTransitionTime":"2025-11-22T09:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.034681 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.034733 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.034771 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.034712 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.034849 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.034905 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.035010 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.035116 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.097020 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.097089 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.097102 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.097121 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.097133 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.200207 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.200295 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.200315 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.200343 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.200363 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.302506 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.302574 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.302584 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.302602 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.302612 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.406373 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.406424 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.406433 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.406450 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.406459 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.513565 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.513638 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.513657 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.513681 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.513699 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.616857 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.616944 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.616968 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.616998 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.617017 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.720099 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.720142 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.720155 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.720171 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.720182 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.801694 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.801741 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.801750 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.801768 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.801789 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.815700 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:42Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.821122 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.821158 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.821170 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.821186 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.821197 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.836600 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:42Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.841123 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.841179 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.841196 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.841214 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.841227 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.858875 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:42Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.862629 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.862681 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.862691 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.862708 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.862719 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.876672 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:42Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.881880 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.881934 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.881944 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.881966 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.881977 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.894433 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:42Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:42 crc kubenswrapper[4846]: E1122 09:14:42.894557 4846 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
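The failed node-status update above dies on TLS, not on the API server: the kubelet cannot POST to the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, months before the node's current clock time of 2025-11-22. A minimal sketch of how one might confirm the expiry from the node (illustrative, not part of the log; assumes the third-party pyca/cryptography package is installed and the webhook port is reachable):

```python
# Fetch and inspect the webhook's serving certificate; the endpoint is the
# one quoted in the kubelet error above, everything else is an assumption.
import datetime
import ssl

from cryptography import x509  # third-party: pyca/cryptography

HOST, PORT = "127.0.0.1", 9743  # node.network-node-identity webhook

# get_server_certificate() does not verify the chain by default, so it
# still returns the PEM even though the certificate is expired.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.datetime.now(datetime.timezone.utc)
print("notBefore:", cert.not_valid_before_utc)  # cryptography >= 42
print("notAfter: ", cert.not_valid_after_utc)
if now > cert.not_valid_after_utc:
    print("expired -> matches the kubelet's 'certificate has expired' error")
```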
event="NodeHasSufficientMemory" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.897452 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.897463 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.897479 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:42 crc kubenswrapper[4846]: I1122 09:14:42.897490 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:42Z","lastTransitionTime":"2025-11-22T09:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.000394 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.000446 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.000459 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.000482 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.000497 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.103468 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.103553 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.103578 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.103612 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.103641 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.206709 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.206768 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.206782 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.206806 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.206820 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.309348 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.309385 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.309394 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.309409 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.309421 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.412300 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.412352 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.412404 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.412427 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.412440 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.515735 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.515792 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.515806 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.515827 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.515839 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.618564 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.618608 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.618618 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.618634 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.618645 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.722292 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.722365 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.722384 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.722408 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.722426 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.826110 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.826183 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.826200 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.826227 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.826249 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.930699 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.930781 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.930799 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.930827 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:43 crc kubenswrapper[4846]: I1122 09:14:43.930849 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:43Z","lastTransitionTime":"2025-11-22T09:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.033598 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.033664 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.033677 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.033698 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.033712 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
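The five-record group above (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, then the "Node became not ready" condition) repeats with only its timestamps advancing, roughly every 100 ms from 09:14:42.897 through 09:14:46.004. When reading a journal like this, collapsing consecutive records that differ only in their timestamps makes the unique events stand out; a small sketch, assuming one record per line on stdin (the timestamp patterns are illustrative, not a complete journald grammar):

```python
# uniq -c style collapsing of journal records, ignoring timestamp churn.
import re
import sys

# Mask RFC3339 stamps inside conditions plus syslog/klog clock times.
TS = re.compile(r"\d{4}-\d\d-\d\dT\d\d:\d\d:\d\dZ|\d\d:\d\d:\d\d(?:\.\d+)?")

prev, count = None, 0
for raw in sys.stdin:
    key = TS.sub("<ts>", raw.rstrip("\n"))
    if key == prev:
        count += 1
        continue
    if prev is not None:
        print(f"{count:5d}x {prev}")
    prev, count = key, 1
if prev is not None:
    print(f"{count:5d}x {prev}")
```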
Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.034806 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.034936 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.034936 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:44 crc kubenswrapper[4846]: E1122 09:14:44.035178 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.035197 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:44 crc kubenswrapper[4846]: E1122 09:14:44.035291 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:44 crc kubenswrapper[4846]: E1122 09:14:44.035356 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
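The pod sync failures above all share one root condition: the kubelet finds no CNI network configuration under /etc/kubernetes/cni/net.d/, so no sandbox can be created for any pod that needs pod networking. A quick way to check what the kubelet sees (sketch; only the directory path is taken from the error text, the extension filter is an assumption):

```python
# List the CNI network configs the kubelet would load; an empty result
# reproduces the NetworkPluginNotReady condition logged above.
from pathlib import Path

cni_dir = Path("/etc/kubernetes/cni/net.d")  # directory quoted in the error
if not cni_dir.is_dir():
    print(f"{cni_dir} does not exist")
else:
    found = sorted(p.name for p in cni_dir.iterdir()
                   if p.suffix in {".conf", ".conflist", ".json"})
    print(f"{cni_dir}: {found or 'no CNI configuration files found'}")
```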
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.137138 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.137222 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.137247 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.137281 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.137310 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.245871 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.246022 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.246288 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.246334 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.246362 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.351309 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.351404 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.351428 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.351460 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.351558 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.454658 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.454708 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.454720 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.454739 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.454752 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.558293 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.558346 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.558358 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.558377 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.558390 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.662334 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.662410 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.662443 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.662471 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.662492 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.765350 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.765428 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.765450 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.765479 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.765502 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.868921 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.868999 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.869022 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.869067 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.869083 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.972915 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.973000 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.973027 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.973101 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:44 crc kubenswrapper[4846]: I1122 09:14:44.973130 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:44Z","lastTransitionTime":"2025-11-22T09:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.076027 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.076153 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.076192 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.076225 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.076251 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:45Z","lastTransitionTime":"2025-11-22T09:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.179301 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.179368 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.179382 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.179410 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.179425 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:45Z","lastTransitionTime":"2025-11-22T09:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.283323 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.283375 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.283387 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.283411 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.283435 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:45Z","lastTransitionTime":"2025-11-22T09:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.385431 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.385479 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.385489 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.385508 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.385520 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:45Z","lastTransitionTime":"2025-11-22T09:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.489150 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.489210 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.489224 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.489243 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.489258 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:45Z","lastTransitionTime":"2025-11-22T09:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.591762 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.591819 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.591832 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.591853 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.591866 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:45Z","lastTransitionTime":"2025-11-22T09:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.694514 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.694557 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.694567 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.694582 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.694592 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:45Z","lastTransitionTime":"2025-11-22T09:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.798150 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.798206 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.798217 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.798239 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.798254 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:45Z","lastTransitionTime":"2025-11-22T09:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.901526 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.901582 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.901592 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.901609 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:45 crc kubenswrapper[4846]: I1122 09:14:45.901621 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:45Z","lastTransitionTime":"2025-11-22T09:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.004879 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.004928 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.004940 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.004961 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.004977 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.034591 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.034668 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.034767 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.034870 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:46 crc kubenswrapper[4846]: E1122 09:14:46.034881 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
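Below, the kubelet's status_manager records embed the JSON status patch with backslash-escaped quotes, which makes them hard to read inline. A small sketch for recovering such a patch as pretty-printed JSON (the raw string is a stand-in for a fragment pasted from a record; it undoes only the single escaping level shown here, while a journal dump may stack more):

```python
import json

# Stand-in fragment; paste the escaped patch text from a record here.
raw = r"{\"metadata\":{\"uid\":\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\"}}"

decoded = raw.replace('\\"', '"')  # undo one level of quote escaping
print(json.dumps(json.loads(decoded), indent=2))
```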
Nov 22 09:14:46 crc kubenswrapper[4846]: E1122 09:14:46.034985 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:46 crc kubenswrapper[4846]: E1122 09:14:46.035073 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:46 crc kubenswrapper[4846]: E1122 09:14:46.035112 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.055490 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.076736 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a
01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:35Z\\\",\\\"message\\\":\\\".LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.233\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:35.770980 6557 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1122 09:14:35.770983 6557 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.097018 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.107522 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.107565 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.107578 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.107598 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.107616 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.113478 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.130546 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6689f712-a146-4e6c-b428-02663f7d5906\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2421789414be0566b9e054990def55fa91dc75dd3e1244d4a7dca86c0aafc17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40455af6104d8aa70c20ac13f028d0c69e35e75e7c8f4e2fca52ba27ba956a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7342fe923d5b1956317e71de0a3772d27d9597f42fb3a25c1e7ed3dc2359e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.147341 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 
2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.162680 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.179223 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"
mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.193870 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.206370 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.210415 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.210457 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.210470 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.210492 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.210510 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.219203 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.232910 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.247834 4846 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.264333 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.280822 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.293312 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.306403 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:46Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.313340 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.313476 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.313605 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.313682 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.313751 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.417102 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.417140 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.417149 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.417165 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.417175 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.520161 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.520229 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.520240 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.520256 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.520290 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.624880 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.624933 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.624948 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.624971 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.624984 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.729574 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.729612 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.729625 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.729644 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.729656 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.832551 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.832655 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.832680 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.832712 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.832735 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.935134 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.935188 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.935201 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.935220 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:46 crc kubenswrapper[4846]: I1122 09:14:46.935230 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:46Z","lastTransitionTime":"2025-11-22T09:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.038174 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.038213 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.038223 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.038239 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.038250 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.140374 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.140422 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.140435 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.140455 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.140469 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.243074 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.243542 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.243601 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.243668 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.243729 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.345979 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.346017 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.346026 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.346115 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.346129 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.449307 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.449694 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.450068 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.450347 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.450584 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.553726 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.554143 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.554269 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.554354 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.554473 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.658756 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.658846 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.658862 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.658884 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.658898 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.762249 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.763498 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.763710 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.763869 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.764008 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.867295 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.867329 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.867339 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.867353 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.867363 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.969861 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.969947 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.969973 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.970008 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:47 crc kubenswrapper[4846]: I1122 09:14:47.970032 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:47Z","lastTransitionTime":"2025-11-22T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.035263 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.035361 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.035263 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:48 crc kubenswrapper[4846]: E1122 09:14:48.035435 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.035459 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:48 crc kubenswrapper[4846]: E1122 09:14:48.035573 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:48 crc kubenswrapper[4846]: E1122 09:14:48.035666 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:48 crc kubenswrapper[4846]: E1122 09:14:48.035841 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.072895 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.072947 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.072957 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.072979 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.072991 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:48Z","lastTransitionTime":"2025-11-22T09:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.176200 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.176252 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.176262 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.176279 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.176291 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:48Z","lastTransitionTime":"2025-11-22T09:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.279143 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.279200 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.279212 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.279233 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.279247 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:48Z","lastTransitionTime":"2025-11-22T09:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.381821 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.381877 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.381893 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.381913 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.381924 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:48Z","lastTransitionTime":"2025-11-22T09:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.485612 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.485671 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.485683 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.485701 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.485714 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:48Z","lastTransitionTime":"2025-11-22T09:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.588770 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.588860 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.588883 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.588912 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.588934 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:48Z","lastTransitionTime":"2025-11-22T09:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.692481 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.692545 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.692558 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.692578 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.692594 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:48Z","lastTransitionTime":"2025-11-22T09:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.796147 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.796240 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.796270 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.796304 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.796333 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:48Z","lastTransitionTime":"2025-11-22T09:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.900109 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.900196 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.900220 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.900251 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:48 crc kubenswrapper[4846]: I1122 09:14:48.900276 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:48Z","lastTransitionTime":"2025-11-22T09:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.004352 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.004445 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.004467 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.004492 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.004507 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.107909 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.108408 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.108561 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.108762 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.108900 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.212018 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.212115 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.212152 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.212170 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.212181 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.315450 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.315491 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.315503 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.315522 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.315534 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.420434 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.420833 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.420865 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.420901 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.420925 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.524911 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.525388 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.525564 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.525717 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.525827 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.629248 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.629306 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.629324 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.629352 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.629371 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.732743 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.732988 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.733078 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.733160 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.733230 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.836198 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.836759 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.836823 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.836902 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.836966 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.940318 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.940611 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.940705 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.940811 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:49 crc kubenswrapper[4846]: I1122 09:14:49.940947 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:49Z","lastTransitionTime":"2025-11-22T09:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.035222 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.035804 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.035967 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:50 crc kubenswrapper[4846]: E1122 09:14:50.036103 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.035379 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.036182 4846 scope.go:117] "RemoveContainer" containerID="81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098" Nov 22 09:14:50 crc kubenswrapper[4846]: E1122 09:14:50.036187 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:50 crc kubenswrapper[4846]: E1122 09:14:50.036394 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:50 crc kubenswrapper[4846]: E1122 09:14:50.036470 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" Nov 22 09:14:50 crc kubenswrapper[4846]: E1122 09:14:50.036750 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.044192 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.044242 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.044252 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.044267 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.044278 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.147730 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.147838 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.147868 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.147902 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.147928 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.250230 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.250285 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.250297 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.250317 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.250331 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.353497 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.353550 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.353559 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.353576 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.353586 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.456448 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.456500 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.456511 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.456530 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.456542 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.558634 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.558688 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.558702 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.558723 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.558736 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.662241 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.662285 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.662293 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.662307 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.662317 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.765570 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.765633 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.765646 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.765667 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.765688 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.867819 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.867859 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.867870 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.867886 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.867896 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.970786 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.970826 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.970836 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.970851 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:50 crc kubenswrapper[4846]: I1122 09:14:50.970860 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:50Z","lastTransitionTime":"2025-11-22T09:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.074128 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.074180 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.074194 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.074217 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.074232 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:51Z","lastTransitionTime":"2025-11-22T09:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.177531 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.177589 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.177601 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.177618 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.177631 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:51Z","lastTransitionTime":"2025-11-22T09:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.280319 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.280391 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.280413 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.280439 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.280457 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:51Z","lastTransitionTime":"2025-11-22T09:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.383035 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.383115 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.383127 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.383465 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.383476 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:51Z","lastTransitionTime":"2025-11-22T09:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.486938 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.486991 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.487001 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.487019 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.487034 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:51Z","lastTransitionTime":"2025-11-22T09:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.590120 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.590185 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.590210 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.590235 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.590251 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:51Z","lastTransitionTime":"2025-11-22T09:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.726817 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.726867 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.726880 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.726903 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.726916 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:51Z","lastTransitionTime":"2025-11-22T09:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.746460 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:51 crc kubenswrapper[4846]: E1122 09:14:51.746629 4846 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:51 crc kubenswrapper[4846]: E1122 09:14:51.746711 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs podName:e79bf3c4-87ae-4009-9a11-d26130912fef nodeName:}" failed. No retries permitted until 2025-11-22 09:15:23.746677166 +0000 UTC m=+98.682366815 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs") pod "network-metrics-daemon-79xpm" (UID: "e79bf3c4-87ae-4009-9a11-d26130912fef") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.830232 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.830291 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.830304 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.830329 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.830343 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:51Z","lastTransitionTime":"2025-11-22T09:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.934015 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.934106 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.934120 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.934168 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:51 crc kubenswrapper[4846]: I1122 09:14:51.934188 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:51Z","lastTransitionTime":"2025-11-22T09:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.034771 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:52 crc kubenswrapper[4846]: E1122 09:14:52.035337 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.034890 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:52 crc kubenswrapper[4846]: E1122 09:14:52.035462 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.034934 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:52 crc kubenswrapper[4846]: E1122 09:14:52.035523 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.034807 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:52 crc kubenswrapper[4846]: E1122 09:14:52.035627 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.036221 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.036257 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.036266 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.036283 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.036293 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.139563 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.139610 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.139619 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.139636 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.139647 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.242779 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.242833 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.242847 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.242867 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.242883 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.346418 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.346471 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.346480 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.346498 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.346510 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.449672 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.450258 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.450504 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.450747 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.450997 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.554413 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.554484 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.554497 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.554517 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.554529 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.657772 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.657828 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.657841 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.657865 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.657876 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.760644 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.760713 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.760726 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.760746 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.760759 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.863730 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.863779 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.863793 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.863813 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.863824 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.966214 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.966251 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.966260 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.966277 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.966288 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.978586 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.978624 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.978632 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.978647 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.978657 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:52 crc kubenswrapper[4846]: E1122 09:14:52.991328 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:52Z is after 
2025-08-24T17:21:41Z" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.995896 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.996292 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.996407 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.996515 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:52 crc kubenswrapper[4846]: I1122 09:14:52.996596 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:52Z","lastTransitionTime":"2025-11-22T09:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: E1122 09:14:53.009471 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:53Z is after 
2025-08-24T17:21:41Z" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.014126 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.014172 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.014185 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.014207 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.014218 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: E1122 09:14:53.026337 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:53Z is after 
2025-08-24T17:21:41Z" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.030000 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.030054 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.030066 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.030084 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.030096 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: E1122 09:14:53.042153 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:53Z is after 
2025-08-24T17:21:41Z" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.046803 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.046931 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.046999 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.047110 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.047190 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: E1122 09:14:53.060587 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:53Z is after 
2025-08-24T17:21:41Z" Nov 22 09:14:53 crc kubenswrapper[4846]: E1122 09:14:53.061033 4846 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.070430 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.070485 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.070499 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.070521 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.070539 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.173968 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.174017 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.174030 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.174109 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.174141 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.277019 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.277087 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.277096 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.277115 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.277125 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.380035 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.380102 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.380114 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.380132 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.380143 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.483015 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.483084 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.483094 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.483112 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.483121 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.585782 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.585825 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.585838 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.585857 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.585875 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.688845 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.688899 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.688911 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.688934 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.688948 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.791955 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.791995 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.792005 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.792022 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.792033 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.893856 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.893897 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.893909 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.893928 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.893944 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.996976 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.997023 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.997031 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.997064 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:53 crc kubenswrapper[4846]: I1122 09:14:53.997077 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:53Z","lastTransitionTime":"2025-11-22T09:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.034389 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.034472 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.034402 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.034816 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:54 crc kubenswrapper[4846]: E1122 09:14:54.034825 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:54 crc kubenswrapper[4846]: E1122 09:14:54.034892 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:54 crc kubenswrapper[4846]: E1122 09:14:54.034917 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:54 crc kubenswrapper[4846]: E1122 09:14:54.034945 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.100362 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.100417 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.100431 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.100453 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.100468 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:54Z","lastTransitionTime":"2025-11-22T09:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.209259 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.209317 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.209328 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.209348 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.209361 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:54Z","lastTransitionTime":"2025-11-22T09:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.311976 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.312011 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.312020 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.312034 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.312061 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:54Z","lastTransitionTime":"2025-11-22T09:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.414801 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.414847 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.414860 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.414879 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.414890 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:54Z","lastTransitionTime":"2025-11-22T09:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.518108 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.518147 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.518159 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.518180 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.518194 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:54Z","lastTransitionTime":"2025-11-22T09:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.621225 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.621296 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.621308 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.621327 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.621338 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:54Z","lastTransitionTime":"2025-11-22T09:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.724124 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.724198 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.724216 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.724244 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.724268 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:54Z","lastTransitionTime":"2025-11-22T09:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.827497 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.827553 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.827571 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.827592 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.827607 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:54Z","lastTransitionTime":"2025-11-22T09:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.930277 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.930351 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.930370 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.930392 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:54 crc kubenswrapper[4846]: I1122 09:14:54.930408 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:54Z","lastTransitionTime":"2025-11-22T09:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.033638 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.033707 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.033735 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.033775 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.033802 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.137862 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.137944 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.137969 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.138002 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.138026 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.241322 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.241386 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.241401 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.241423 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.241438 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.344480 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.344546 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.344558 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.344579 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.344592 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.447419 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.447457 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.447469 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.447487 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.447503 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.550136 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.550193 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.550204 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.550223 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.550236 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.652956 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.653011 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.653023 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.653068 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.653088 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.756063 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.756106 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.756120 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.756140 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.756156 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.859557 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.859641 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.859655 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.859676 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.859690 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.962304 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.962346 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.962357 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.962377 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:55 crc kubenswrapper[4846]: I1122 09:14:55.962387 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:55Z","lastTransitionTime":"2025-11-22T09:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.035253 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.035407 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.035435 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:56 crc kubenswrapper[4846]: E1122 09:14:56.035562 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.035798 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:56 crc kubenswrapper[4846]: E1122 09:14:56.035875 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:56 crc kubenswrapper[4846]: E1122 09:14:56.036081 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:56 crc kubenswrapper[4846]: E1122 09:14:56.036336 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.051460 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.065452 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.065494 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.065507 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.065524 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.065538 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.068960 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.086116 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.098990 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6689f712-a146-4e6c-b428-02663f7d5906\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2421789414be0566b9e054990def55fa91dc75dd3e1244d4a7dca86c0aafc17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40455af6104d8aa70c20ac13f028d0c69e35e75e7c8f4e2fca52ba27ba956a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7342fe923d5b1956317e71de0a3772d27d9597f42fb3a25c1e7ed3dc2359e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.111686 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 
2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.123405 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.137871 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.157838 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.168480 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.168530 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.168539 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.168557 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.168569 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.174190 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.187034 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
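Interleaved with the webhook failures, the kubelet keeps re-recording NodeNotReady with the same root cause: "no CNI configuration file in /etc/kubernetes/cni/net.d/". The directory is named in the message itself; the network plugin (ovnkube, further down) has not written its config there yet, so the container runtime network stays unready. A quick check of that directory can be sketched in Go; treating .conf, .conflist and .json as the config extensions is an assumption carried over from common CNI loader behavior:

package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
	"strings"
)

// Directory taken verbatim from the kubelet's NetworkPluginNotReady message.
const cniConfDir = "/etc/kubernetes/cni/net.d"

func main() {
	entries, err := os.ReadDir(cniConfDir)
	if err != nil {
		log.Fatalf("read %s: %v", cniConfDir, err)
	}
	found := false
	for _, e := range entries {
		// CNI loaders conventionally pick up .conf, .conflist and .json files.
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", filepath.Join(cniConfDir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file found; the node will stay NotReady")
	}
}

An empty result here is consistent with ovnkube-controller crash-looping before it can install its config, as the ovnkube-node-kws67 status later in this section shows.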
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.200976 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.215714 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.230636 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.245034 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.261837 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.271993 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.272059 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.272071 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.272091 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.272101 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.276631 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.299849 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a
01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:35Z\\\",\\\"message\\\":\\\".LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.233\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:35.770980 6557 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1122 09:14:35.770983 6557 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.374730 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.374784 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.374794 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.374833 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.374845 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.478021 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.478092 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.478106 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.478124 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.478136 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.558955 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/0.log" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.559011 4846 generic.go:334] "Generic (PLEG): container finished" podID="9aec6a38-e6e4-4009-95d2-6a179c7fac04" containerID="0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd" exitCode=1 Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.559067 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbcs8" event={"ID":"9aec6a38-e6e4-4009-95d2-6a179c7fac04","Type":"ContainerDied","Data":"0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.559546 4846 scope.go:117] "RemoveContainer" containerID="0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.573518 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.580868 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.580910 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.580921 4846 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.580939 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.580954 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.587756 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.607688 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a
01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:35Z\\\",\\\"message\\\":\\\".LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.233\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:35.770980 6557 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1122 09:14:35.770983 6557 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.626457 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.641669 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.654695 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6689f712-a146-4e6c-b428-02663f7d5906\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2421789414be0566b9e054990def55fa91dc75dd3e1244d4a7dca86c0aafc17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40455af6104d8aa70c20ac13f028d0c69e35e75e7c8f4e2fca52ba27ba956a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7342fe923d5b1956317e71de0a3772d27d9597f42fb3a25c1e7ed3dc2359e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.667972 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 
2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.683751 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:55Z\\\",\\\"message\\\":\\\"2025-11-22T09:14:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0\\\\n2025-11-22T09:14:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0 to /host/opt/cni/bin/\\\\n2025-11-22T09:14:10Z [verbose] multus-daemon started\\\\n2025-11-22T09:14:10Z [verbose] Readiness Indicator file check\\\\n2025-11-22T09:14:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.684005 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.684036 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.684062 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.684077 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.684087 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.697739 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.710549 4846 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.723386 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.733603 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.744313 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.757785 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f
713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.776948 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.787425 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.787466 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.787480 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.787499 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.787514 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.789364 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.804582 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:56Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.890097 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.890155 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.890165 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.890184 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.890195 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.992022 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.992120 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.992142 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.992173 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:56 crc kubenswrapper[4846]: I1122 09:14:56.992196 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:56Z","lastTransitionTime":"2025-11-22T09:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.095293 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.095359 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.095373 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.095397 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.095411 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:57Z","lastTransitionTime":"2025-11-22T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.198891 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.198936 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.198948 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.198963 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.198976 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:57Z","lastTransitionTime":"2025-11-22T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.302690 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.302743 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.302756 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.302774 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.302785 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:57Z","lastTransitionTime":"2025-11-22T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.405035 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.405102 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.405113 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.405133 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.405147 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:57Z","lastTransitionTime":"2025-11-22T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.508418 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.508489 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.508508 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.508530 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.508544 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:57Z","lastTransitionTime":"2025-11-22T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.565829 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/0.log" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.565901 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbcs8" event={"ID":"9aec6a38-e6e4-4009-95d2-6a179c7fac04","Type":"ContainerStarted","Data":"8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.583006 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.595661 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6689f712-a146-4e6c-b428-02663f7d5906\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2421789414be0566b9e054990def55fa91dc75dd3e1244d4a7dca86c0aafc17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40455af6104d8aa70c20ac13f028d0c69e35e75e7c8f4e2fca52ba27ba956a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7342fe923d5b1956317e71de0a3772d27d9597f42fb3a25c1e7ed3dc2359e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.611853 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.611904 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.611919 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.611939 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.611951 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:57Z","lastTransitionTime":"2025-11-22T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.612581 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.626098 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:55Z\\\",\\\"message\\\":\\\"2025-11-22T09:14:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0\\\\n2025-11-22T09:14:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0 to /host/opt/cni/bin/\\\\n2025-11-22T09:14:10Z [verbose] multus-daemon started\\\\n2025-11-22T09:14:10Z [verbose] Readiness Indicator file check\\\\n2025-11-22T09:14:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.639740 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.652223 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.663763 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.675916 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.688250 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.702723 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f
713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.714846 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.715123 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.715196 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.715210 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.715230 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.715241 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:57Z","lastTransitionTime":"2025-11-22T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.725905 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.738156 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.752483 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.765605 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.784524 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a
01fd348d53e56b8915fc7098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:35Z\\\",\\\"message\\\":\\\".LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.233\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:35.770980 6557 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1122 09:14:35.770983 6557 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.796422 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:14:57Z is after 2025-08-24T17:21:41Z" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.818320 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.818378 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.818391 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.818409 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.818421 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:57Z","lastTransitionTime":"2025-11-22T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.920735 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.920779 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.920792 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.920813 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:57 crc kubenswrapper[4846]: I1122 09:14:57.920830 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:57Z","lastTransitionTime":"2025-11-22T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.022618 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.022667 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.022678 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.022727 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.022738 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.034908 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:14:58 crc kubenswrapper[4846]: E1122 09:14:58.035104 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.035282 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.035352 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.035349 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:14:58 crc kubenswrapper[4846]: E1122 09:14:58.035453 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:14:58 crc kubenswrapper[4846]: E1122 09:14:58.035551 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:14:58 crc kubenswrapper[4846]: E1122 09:14:58.035893 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.125771 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.125818 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.125829 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.125844 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.125857 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.233263 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.233313 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.233326 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.233350 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.233377 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.336010 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.336179 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.336200 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.336226 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.336242 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.439033 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.439105 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.439117 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.439136 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.439146 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.542582 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.542640 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.542650 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.542671 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.542686 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.645961 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.646019 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.646032 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.646070 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.646085 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.748038 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.748142 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.748153 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.748173 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.748185 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.850772 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.850830 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.850840 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.850861 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.850872 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.953489 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.953530 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.953540 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.953556 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:58 crc kubenswrapper[4846]: I1122 09:14:58.953566 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:58Z","lastTransitionTime":"2025-11-22T09:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.056371 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.056439 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.056452 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.056473 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.056486 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.159866 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.160437 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.160505 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.160614 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.160700 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.262722 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.262765 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.262774 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.262789 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.262799 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.366918 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.366965 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.366981 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.367063 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.367081 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.470085 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.470128 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.470137 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.470153 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.470165 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.572192 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.572233 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.572242 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.572257 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.572269 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.675202 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.675275 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.675299 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.675331 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.675357 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.777915 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.777960 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.777972 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.777994 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.778014 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.881174 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.881488 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.881557 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.881634 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.881690 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.985006 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.985072 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.985111 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.985128 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:14:59 crc kubenswrapper[4846]: I1122 09:14:59.985137 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:14:59Z","lastTransitionTime":"2025-11-22T09:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.034885 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.034885 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.034899 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:00 crc kubenswrapper[4846]: E1122 09:15:00.035084 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.035214 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:00 crc kubenswrapper[4846]: E1122 09:15:00.035279 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:00 crc kubenswrapper[4846]: E1122 09:15:00.035443 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:00 crc kubenswrapper[4846]: E1122 09:15:00.035496 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.087020 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.087356 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.087527 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.087665 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.087786 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:00Z","lastTransitionTime":"2025-11-22T09:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.190577 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.190619 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.190650 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.190667 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.190676 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:00Z","lastTransitionTime":"2025-11-22T09:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.293743 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.293805 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.293826 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.293857 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.293873 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:00Z","lastTransitionTime":"2025-11-22T09:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.396510 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.396557 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.396566 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.396583 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.396592 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:00Z","lastTransitionTime":"2025-11-22T09:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.499416 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.499458 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.499470 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.499488 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.499502 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:00Z","lastTransitionTime":"2025-11-22T09:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.601398 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.601436 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.601446 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.601462 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.601472 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:00Z","lastTransitionTime":"2025-11-22T09:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.704618 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.704665 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.704674 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.704689 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.704700 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:00Z","lastTransitionTime":"2025-11-22T09:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.814888 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.815275 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.815368 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.815463 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.815543 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:00Z","lastTransitionTime":"2025-11-22T09:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.918916 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.918968 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.918985 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.919009 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:00 crc kubenswrapper[4846]: I1122 09:15:00.919028 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:00Z","lastTransitionTime":"2025-11-22T09:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.022991 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.023433 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.023539 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.023711 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.023810 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.126457 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.126503 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.126517 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.126540 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.126554 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.229136 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.229196 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.229210 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.229229 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.229242 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.331717 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.332210 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.332311 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.332504 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.332648 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.435294 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.435959 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.436161 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.436333 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.436460 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.540318 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.540382 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.540396 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.540418 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.540432 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.642753 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.643108 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.643213 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.643305 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.643387 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.784195 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.784234 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.784242 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.784257 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.784267 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.886884 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.887176 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.887277 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.887371 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.887459 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.990181 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.990228 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.990238 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.990255 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:01 crc kubenswrapper[4846]: I1122 09:15:01.990265 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:01Z","lastTransitionTime":"2025-11-22T09:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.034873 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.034905 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.035018 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:02 crc kubenswrapper[4846]: E1122 09:15:02.035196 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:02 crc kubenswrapper[4846]: E1122 09:15:02.035491 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:02 crc kubenswrapper[4846]: E1122 09:15:02.035615 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.034905 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:02 crc kubenswrapper[4846]: E1122 09:15:02.036065 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.036501 4846 scope.go:117] "RemoveContainer" containerID="81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.093477 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.093688 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.093789 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.093897 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.093985 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:02Z","lastTransitionTime":"2025-11-22T09:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.197324 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.197782 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.198186 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.198381 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.198536 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:02Z","lastTransitionTime":"2025-11-22T09:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.301835 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.301882 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.301895 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.301916 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.301928 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:02Z","lastTransitionTime":"2025-11-22T09:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.404340 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.404381 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.404390 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.404411 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.404421 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:02Z","lastTransitionTime":"2025-11-22T09:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.507433 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.507484 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.507497 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.507519 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.507532 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:02Z","lastTransitionTime":"2025-11-22T09:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.609887 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.609952 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.609967 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.609989 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.610005 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:02Z","lastTransitionTime":"2025-11-22T09:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.712329 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.712371 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.712381 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.712398 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.712413 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:02Z","lastTransitionTime":"2025-11-22T09:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.815037 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.815093 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.815105 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.815121 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.815131 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:02Z","lastTransitionTime":"2025-11-22T09:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.918130 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.918176 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.918187 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.918207 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:02 crc kubenswrapper[4846]: I1122 09:15:02.918221 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:02Z","lastTransitionTime":"2025-11-22T09:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.020329 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.020387 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.020400 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.020418 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.020431 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.067568 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.123510 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.123555 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.123567 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.123586 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.123600 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.199337 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.199392 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.199403 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.199420 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.199433 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: E1122 09:15:03.214556 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.218607 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.218647 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.218659 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.218674 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.218684 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: E1122 09:15:03.231353 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.235259 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.235480 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.235546 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.235618 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.235678 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: E1122 09:15:03.249087 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.254481 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.254628 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.254693 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.254761 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.254854 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: E1122 09:15:03.266297 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.270630 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.270904 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.271022 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.271144 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.271222 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: E1122 09:15:03.284222 4846 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ee6693c1-3f64-4a2f-bacb-fc8ddbe6885e\\\",\\\"systemUUID\\\":\\\"fa21c007-e82e-49e1-be5e-f6cba7f9397a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: E1122 09:15:03.284356 4846 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.286690 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.286799 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.286883 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.286970 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.287037 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.390217 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.390258 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.390267 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.390283 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.390293 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.492839 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.492886 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.492898 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.492912 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.492923 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.585446 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/3.log" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.586245 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/2.log" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.589971 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f" exitCode=1 Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.590078 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.590155 4846 scope.go:117] "RemoveContainer" containerID="81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.591442 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f" Nov 22 09:15:03 crc kubenswrapper[4846]: E1122 09:15:03.591663 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.595376 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.595409 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.595420 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.595440 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.595453 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.611893 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.627738 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.651017 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e834f9df7a8858c2ecf5868d975bc823
9a57014d20122c7b5f46e076b9d2b81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e5bdcf862b9493a5950a3d82ca2746c1583b7a01fd348d53e56b8915fc7098\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:35Z\\\",\\\"message\\\":\\\".LB{Name:\\\\\\\"Service_openshift-kube-scheduler-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-scheduler-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.233\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 09:14:35.770980 6557 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF1122 09:14:35.770983 6557 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"re column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 09:15:03.411375 6908 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-4h26m in node crc\\\\nI1122 09:15:03.412272 6908 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-4h26m after 0 failed attempt(s)\\\\nI1122 09:15:03.412278 6908 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"
name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.667466 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 
09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.682817 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.697137 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6689f712-a146-4e6c-b428-02663f7d5906\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2421789414be0566b9e054990def55fa91dc75dd3e1244d4a7dca86c0aafc17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40455af6104d8aa70c20ac13f028d0c69e35e75e7c8f4e2fca52ba27ba956a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7342fe923d5b1956317e71de0a3772d27d9597f42fb3a25c1e7ed3dc2359e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.697849 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.697900 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.697918 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.697945 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.697962 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.713611 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.729239 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:55Z\\\",\\\"message\\\":\\\"2025-11-22T09:14:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0\\\\n2025-11-22T09:14:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0 to /host/opt/cni/bin/\\\\n2025-11-22T09:14:10Z [verbose] multus-daemon started\\\\n2025-11-22T09:14:10Z [verbose] Readiness Indicator file check\\\\n2025-11-22T09:14:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.748755 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.768775 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.784675 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.799635 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.800626 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.801259 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.801287 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.801341 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.801363 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.818969 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.846209 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fa61668-18c9-4a11-a3f8-d996a18e5b93\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce3a473d395602dc4a069e17443c0dd35f1a07bf91449ffd385d2c33d15f42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc9369cc8aeb5063c77b2903f11b072e13ad00b65213695b24aece2781447c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba5b11f0d56e4061408321dc3a81fa978fe16802d273f013ed469ddd30fd59c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97534396f697eeceaa32da9cb02f110cc9a3bbe
90f98b8ccfac13d2d692fd13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460f9a7df0be432a0284ae062957ae60158969f4b6cb7e8ae907d10ddd488a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ac6c274e3298628752bd2f94ee3115f49e18bbaac261b01ccac34d899589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9807ac6c274e3298628752bd2f94ee3115f49e18bbaac261b01ccac34d899589\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bea4a5838fbaab7ee149b0c094eee1b75d2df12560b370f9b75b61f228e2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25bea4a5838fbaab7ee149b0c094eee1b75d2df12560b370f9b75b61f228e2f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9611dcbc5d73d0e37ff3e99b39e84a783a6b01aa2544711bc46ba5519cb35e01\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9611dcbc5d73d0e37ff3e99b39e84a783a6b01aa2544711bc46ba5519cb35e01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.863331 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.876853 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.890230 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.905071 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.905124 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.905137 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.905160 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.905171 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:03Z","lastTransitionTime":"2025-11-22T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:03 crc kubenswrapper[4846]: I1122 09:15:03.905170 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:03Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.008925 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc 
kubenswrapper[4846]: I1122 09:15:04.008991 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.009010 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.009124 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.009146 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.035169 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.035181 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.035274 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.035316 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:04 crc kubenswrapper[4846]: E1122 09:15:04.035495 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:04 crc kubenswrapper[4846]: E1122 09:15:04.035746 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:04 crc kubenswrapper[4846]: E1122 09:15:04.035999 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:04 crc kubenswrapper[4846]: E1122 09:15:04.036208 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.111775 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.111857 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.111868 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.111914 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.111929 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.215113 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.215164 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.215173 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.215192 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.215204 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.317573 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.317634 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.317643 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.317658 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.317668 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.420380 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.420469 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.420503 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.420525 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.420535 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.523243 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.523459 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.523482 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.523511 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.523534 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.597624 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/3.log" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.627003 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.627079 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.627091 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.627112 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.627123 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.730431 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.730491 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.730502 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.730522 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.730538 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.833562 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.833649 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.833663 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.833708 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.833730 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.937362 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.937424 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.937438 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.937466 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.937483 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:04Z","lastTransitionTime":"2025-11-22T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.955616 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.957570 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f" Nov 22 09:15:04 crc kubenswrapper[4846]: E1122 09:15:04.958402 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.977338 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:04Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:04 crc kubenswrapper[4846]: I1122 09:15:04.995306 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:55Z\\\",\\\"message\\\":\\\"2025-11-22T09:14:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0\\\\n2025-11-22T09:14:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0 to /host/opt/cni/bin/\\\\n2025-11-22T09:14:10Z [verbose] multus-daemon started\\\\n2025-11-22T09:14:10Z [verbose] Readiness Indicator file check\\\\n2025-11-22T09:14:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:04Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.014125 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.029507 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.040189 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.040236 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.040247 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.040266 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.040276 4846 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.048792 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6689f712-a146-4e6c-b428-02663f7d5906\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2421789414be0566b9e054990def55fa91dc75dd3e1244d4a7dca86c0aafc17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40455af6104d8aa70c20ac13f028d0c69e35e75e7c8f4e2fca52ba27ba956a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7342fe923d5b1956317e71de0a3772d27d9597f42fb3a25c1e7ed3dc2359e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.069696 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.086774 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.098309 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.109835 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.122456 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.135467 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.144532 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.144579 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.144591 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.144609 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.144621 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.149484 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.185990 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fa61668-18c9-4a11-a3f8-d996a18e5b93\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce3a473d395602dc4a069e17443c0dd35f1a07bf91449ffd385d2c33d15f42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc9369cc8aeb5063c77b2903f11b072e13ad00b65213695b24aece2781447c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba5b11f0d56e4061408321dc3a81fa978fe16802d273f013ed469ddd30fd59c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97534396f697eeceaa32da9cb02f110cc9a3bbe
90f98b8ccfac13d2d692fd13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460f9a7df0be432a0284ae062957ae60158969f4b6cb7e8ae907d10ddd488a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ac6c274e3298628752bd2f94ee3115f49e18bbaac261b01ccac34d899589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9807ac6c274e3298628752bd2f94ee3115f49e18bbaac261b01ccac34d899589\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bea4a5838fbaab7ee149b0c094eee1b75d2df12560b370f9b75b61f228e2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25bea4a5838fbaab7ee149b0c094eee1b75d2df12560b370f9b75b61f228e2f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9611dcbc5d73d0e37ff3e99b39e84a783a6b01aa2544711bc46ba5519cb35e01\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9611dcbc5d73d0e37ff3e99b39e84a783a6b01aa2544711bc46ba5519cb35e01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.203812 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.223529 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"re column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 09:15:03.411375 6908 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-4h26m in node crc\\\\nI1122 09:15:03.412272 6908 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-4h26m after 0 failed attempt(s)\\\\nI1122 09:15:03.412278 6908 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:15:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.238572 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.246669 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.246703 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.246711 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.246726 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.246735 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.253962 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.270279 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:05Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.355692 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.355750 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.355763 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.355784 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.355797 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.457774 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.457827 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.457848 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.457868 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.457881 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.561402 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.561462 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.561476 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.561494 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.561508 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.664520 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.664567 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.664576 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.664594 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.664603 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.768804 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.768869 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.768888 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.768915 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.768966 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.871612 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.871666 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.871678 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.871697 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.871709 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.974184 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.974229 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.974241 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.974260 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:05 crc kubenswrapper[4846]: I1122 09:15:05.974271 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:05Z","lastTransitionTime":"2025-11-22T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.034564 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.034645 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.034595 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.034764 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:06 crc kubenswrapper[4846]: E1122 09:15:06.034753 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:06 crc kubenswrapper[4846]: E1122 09:15:06.034818 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:06 crc kubenswrapper[4846]: E1122 09:15:06.034892 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:06 crc kubenswrapper[4846]: E1122 09:15:06.035180 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.049314 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d221887-c475-423e-a65a-cb454d5b2e37\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403a4889b0deda98f0615e85a721dc36998b8ced41683358baf81285ffbdaf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://11cedca43cd4e2803e51659370b91d92f872561267164947786c123f3af07f36\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6dbf12d4116033e9a44dfd64df5a1a9b693431b3634480c0124f72027df438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c105b76ad6a55143e1f4d0dad9c827e9a09cb08036417969532aac3e03a66b8c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b598568859b7cecfc6d810331b528d0429f9c4852f8de77be1faf8768883ef4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"t-ca-file\\\\\\\"\\\\nI1122 09:14:05.777636 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1122 09:14:05.777730 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1122 09:14:05.778480 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1763802830\\\\\\\\\\\\\\\" (2025-11-22 09:13:49 +0000 UTC to 2025-12-22 09:13:50 +0000 UTC (now=2025-11-22 09:14:05.778448457 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778665 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763802840\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763802840\\\\\\\\\\\\\\\" (2025-11-22 08:14:00 +0000 UTC to 2026-11-22 08:14:00 +0000 UTC (now=2025-11-22 09:14:05.778644503 +0000 UTC))\\\\\\\"\\\\nI1122 09:14:05.778697 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1122 09:14:05.778726 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1122 09:14:05.778776 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-230259647/tls.crt::/tmp/serving-cert-230259647/tls.key\\\\\\\"\\\\nI1122 09:14:05.780336 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1122 09:14:05.781386 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781427 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1122 09:14:05.781809 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1122 09:14:05.781956 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://964d39d557d91eeb5a0708e176ad8a4fa844ef8a974a4a15eb885429b284c68b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70016e07399c834b6978bd88e70969aed201762329f2e36d699775023e4d8e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.062476 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6689f712-a146-4e6c-b428-02663f7d5906\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2421789414be0566b9e054990def55fa91dc75dd3e1244d4a7dca86c0aafc17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40455af6104d8aa70c20ac13f028d0c69e35e75e7c8f4e2fca52ba27ba956a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7342fe923d5b1956317e71de0a3772d27d9597f42fb3a25c1e7ed3dc2359e1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f19d7c93cac0c8c507f780ddd295e2562369fdb12df527691171d0058fa221eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.076825 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.076855 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.076866 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.076883 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.076896 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:06Z","lastTransitionTime":"2025-11-22T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.077410 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4686c3dda53f81985564dfe1d0f0a819c5826e27076b1e6c36fc4cf733772a1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.091260 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hbcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aec6a38-e6e4-4009-95d2-6a179c7fac04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:14:55Z\\\",\\\"message\\\":\\\"2025-11-22T09:14:09+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0\\\\n2025-11-22T09:14:09+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38dc17f2-7f92-45e4-b9a5-6fa8cc47d4f0 to /host/opt/cni/bin/\\\\n2025-11-22T09:14:10Z [verbose] multus-daemon started\\\\n2025-11-22T09:14:10Z [verbose] Readiness Indicator file check\\\\n2025-11-22T09:14:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wm7zc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hbcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.110578 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4h26m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c1f212-c2f8-4c90-bd30-57ed4dc2fc84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0044a96db4cd015b7c1eecbe0f709440bb88732c7a98216855776220a5b270f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbe33a042407ca1b2d0c48b47853fcc0b4480c3f1352a1315a680837e931d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7523b571b7d77640d972ec3e96e33afa6094e55fbe7955c55712456319fbbf4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://026417050ce119245013e104e5ae3270ac154d3c52ddfb2a92dcee4ce46c2257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e71cbbb38a00610857d24612ffa0b33555d004035ce4c2097bfefd8ce13f07ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5779ea6758040b285a1e23907056480205590a7d9ce46617fbe8dd6ab7ab9f6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48bdd4691272ac54c3fe87a3171651f11cc478627d707931d3272d856b679417\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pnxjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4h26m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.126643 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.142361 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.154228 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86a01cc5-5438-4978-8919-2d24f665922a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b719d6ecbf360e0d67c4772b05202654cd321092013bbcc4da99c1e4b53dd70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ts2fd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c59mw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.169629 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-grx77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8814a472-d38b-4083-9294-d48a525987c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c537a8500d08235493daf3a8dc1fb784a16fe7b8f4da096ce657d2e29dd039e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-ppnm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-grx77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.178736 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.178789 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.178801 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.178817 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.178827 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:06Z","lastTransitionTime":"2025-11-22T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.194481 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fa61668-18c9-4a11-a3f8-d996a18e5b93\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ce3a473d395602dc4a069e17443c0dd35f1a07bf91449ffd385d2c33d15f42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc9369cc8aeb5063c77b2903f11b072e13ad00b65213695b24aece2781447c6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba5b11f0d56e4061408321dc3a81fa978fe16802d273f013ed469ddd30fd59c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97534396f697eeceaa32da9cb02f110cc9a3bbe90f98b8ccfac13d2d692fd13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://460f9a7df0be432a0284ae062957ae60158969f4b6cb7e8ae907d10ddd488a00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ac6c274e3298628752bd2f94ee3115f49e18bbaac261b01ccac34d899589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9807ac6c274e3298628752bd2f94ee3115f49e18bbaac261b01ccac34d899589\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bea4a5838fbaab7ee149b0c094eee1b75d2df12560b370f9b75b61f228e2f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25bea4a5838fbaab7ee149b0c094eee1b75d2df12560b370f9b75b61f228e2f1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T09:13:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9611dcbc5d73d0e37ff3e99b39e84a783a6b01aa2544711bc46ba5519cb35e01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9611dcbc5d73d0e37ff3e99b39e84a783a6b01aa2544711bc46ba5519cb35e01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.209401 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2490717e-d429-480d-af6c-68d0795e34b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a31f8661df06a55e38dc2bf55ab5de5a3647a955f5123ea72ecfd0800a43de1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d25e922cf6d5159812639733492c56662740ce208b0b12ba358862a8d720cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c17a61b1ff5fa4465232555f658071fd602cb591a3cca7ecc6a1f713be80d748\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://214919699d42b79a599edf42f60904c80caf76d330e02f757325436d96d905c7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:13:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.224471 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76cbc2330512cd961848b9b1a94506ae485cbe2e620c9dc3129f6f32b71ff71b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.237523 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q52w8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2300dfaf-ea26-4b33-8d8c-ab337aa56402\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b1658a988d62acdb8460d698acb44d36581875269d2a82793d789c468e1f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hlpfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q52w8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.250876 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-79xpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79bf3c4-87ae-4009-9a11-d26130912fef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnps2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-79xpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.268330 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://143b809038bbdfc0e652decb51c05c967135b852744227afc5198171e86857de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da5f3c14f85d14a850cc06e3d1da0d286357c3b72ea1f8b13234376969a343e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.283214 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.283274 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.283288 4846 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.283307 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.283320 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:06Z","lastTransitionTime":"2025-11-22T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.284347 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.308561 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c874da16-5eda-477e-bbd5-e5c105dc7a07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e834f9df7a8858c2ecf5868d975bc8239a57014d
20122c7b5f46e076b9d2b81f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T09:15:03Z\\\",\\\"message\\\":\\\"re column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 09:15:03.411375 6908 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-4h26m in node crc\\\\nI1122 09:15:03.412272 6908 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-4h26m after 0 failed attempt(s)\\\\nI1122 09:15:03.412278 6908 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T09:15:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scw9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kws67\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.323286 4846 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11b0a86f-726c-4264-86f0-3691daeebe8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a71d6ba8ab14f3d477a97e4ccf35def84102baa575debaa4e7aa2719567c75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://956c91bd9091fe31b7ad72b8c3febb3ebe84ede901819aafdf45da1c95417c65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T09:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g5x2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:14:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-twrxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T09:15:06Z is after 2025-08-24T17:21:41Z" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.385505 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.385564 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.385573 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.385592 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.385603 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:06Z","lastTransitionTime":"2025-11-22T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.489834 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.489879 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.489889 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.489906 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.489918 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:06Z","lastTransitionTime":"2025-11-22T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.593114 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.593185 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.593197 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.593216 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.593252 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:06Z","lastTransitionTime":"2025-11-22T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.696661 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.696719 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.696731 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.696753 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.696765 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:06Z","lastTransitionTime":"2025-11-22T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.799294 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.799350 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.799375 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.799402 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.799420 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:06Z","lastTransitionTime":"2025-11-22T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.902929 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.902968 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.902977 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.902995 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:15:06 crc kubenswrapper[4846]: I1122 09:15:06.903011 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:06Z","lastTransitionTime":"2025-11-22T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.006095 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.006170 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.006187 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.006214 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.006230 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.109651 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.109720 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.109741 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.109765 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.109779 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.212867 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.212923 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.212937 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.212954 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.212966 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.315821 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.315869 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.315878 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.315901 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.315912 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.419504 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.419577 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.419589 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.419608 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.419619 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.522706 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.522811 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.522838 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.522867 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.522890 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.625650 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.625701 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.625714 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.625732 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.625745 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.728180 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.728222 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.728232 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.728248 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.728258 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.830240 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.830280 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.830292 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.830310 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.830325 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.934560 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.934643 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.934661 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.934692 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:07 crc kubenswrapper[4846]: I1122 09:15:07.934711 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:07Z","lastTransitionTime":"2025-11-22T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
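[editor's note] Every NodeNotReady heartbeat in this run reports the same root cause: no CNI configuration under /etc/kubernetes/cni/net.d/, which is expected while the OVN-Kubernetes control plane (blocked by the webhook failure above) has not yet written one. A stdlib-only sketch of the same directory check, assuming the conventional .conf/.conflist/.json extensions:

    # cni_check.py - look for CNI network configs the way the runtime does (sketch).
    import os

    CNI_DIR = "/etc/kubernetes/cni/net.d"  # directory named in the log message
    EXTS = (".conf", ".conflist", ".json")  # assumption: conventional CNI extensions

    try:
        configs = [f for f in sorted(os.listdir(CNI_DIR)) if f.endswith(EXTS)]
    except FileNotFoundError:
        configs = []

    if configs:
        print("CNI configs present:", ", ".join(configs))
    else:
        print(f"no CNI configuration file in {CNI_DIR} -- matches the kubelet error")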
Nov 22 09:15:08 crc kubenswrapper[4846]: I1122 09:15:08.035402 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 09:15:08 crc kubenswrapper[4846]: I1122 09:15:08.035465 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm"
Nov 22 09:15:08 crc kubenswrapper[4846]: E1122 09:15:08.035612 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 09:15:08 crc kubenswrapper[4846]: E1122 09:15:08.036037 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef"
Nov 22 09:15:08 crc kubenswrapper[4846]: I1122 09:15:08.036726 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 09:15:08 crc kubenswrapper[4846]: I1122 09:15:08.036847 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 09:15:08 crc kubenswrapper[4846]: E1122 09:15:08.036923 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 09:15:08 crc kubenswrapper[4846]: E1122 09:15:08.036939 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 09:15:09 crc kubenswrapper[4846]: I1122 09:15:09.865380 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.865598 4846 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.865746 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.865709063 +0000 UTC m=+148.801398752 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 09:15:09 crc kubenswrapper[4846]: I1122 09:15:09.966829 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:15:09 crc kubenswrapper[4846]: I1122 09:15:09.966974 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967037 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.966995018 +0000 UTC m=+148.902684677 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
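[editor's note] The TearDown failure above indicates the kubevirt.io.hostpath-provisioner CSI driver has not re-registered with the kubelet since the restart. Drivers announce themselves through sockets under the kubelet's plugin registration directory; a sketch that lists what is currently registered, assuming the default path /var/lib/kubelet/plugins_registry:

    # csi_registry.py - list plugin sockets the kubelet treats as registered (sketch).
    import os

    REG_DIR = "/var/lib/kubelet/plugins_registry"  # default registration dir (assumption)
    WANTED = "kubevirt.io.hostpath-provisioner"

    entries = sorted(os.listdir(REG_DIR)) if os.path.isdir(REG_DIR) else []
    print("registration sockets:", entries or "none")
    print(f"{WANTED} registered?", any(WANTED in e for e in entries))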
Nov 22 09:15:09 crc kubenswrapper[4846]: I1122 09:15:09.967137 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 09:15:09 crc kubenswrapper[4846]: I1122 09:15:09.967229 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967227 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967355 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967374 4846 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967369 4846 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967403 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967431 4846 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967435 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.96742484 +0000 UTC m=+148.903114499 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967448 4846 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967495 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.967464071 +0000 UTC m=+148.903153750 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 09:15:09 crc kubenswrapper[4846]: E1122 09:15:09.967537 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.967517413 +0000 UTC m=+148.903207102 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
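[editor's note] The "object ... not registered" errors above refer to the kubelet's local object cache, not necessarily to the API: after a restart the kubelet may simply not have re-synced these ConfigMaps and the Secret yet. A sketch that checks whether the objects actually exist server-side, assuming the third-party kubernetes Python client and a kubeconfig with read access:

    # check_objects.py - confirm the "not registered" objects exist in the API (sketch).
    # Assumes: `pip install kubernetes` and a working kubeconfig.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    objects = [
        ("configmap", "openshift-network-diagnostics", "kube-root-ca.crt"),
        ("configmap", "openshift-network-diagnostics", "openshift-service-ca.crt"),
        ("configmap", "openshift-network-console", "networking-console-plugin"),
        ("secret", "openshift-network-console", "networking-console-plugin-cert"),
    ]

    for kind, ns, name in objects:
        try:
            if kind == "configmap":
                v1.read_namespaced_config_map(name, ns)
            else:
                v1.read_namespaced_secret(name, ns)
            print(f"OK       {kind} {ns}/{name}")
        except client.exceptions.ApiException as e:
            print(f"MISSING  {kind} {ns}/{name} (HTTP {e.status})")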
Nov 22 09:15:10 crc kubenswrapper[4846]: I1122 09:15:10.034492 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm"
Nov 22 09:15:10 crc kubenswrapper[4846]: I1122 09:15:10.034549 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 09:15:10 crc kubenswrapper[4846]: I1122 09:15:10.034495 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 09:15:10 crc kubenswrapper[4846]: E1122 09:15:10.034659 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef"
Nov 22 09:15:10 crc kubenswrapper[4846]: I1122 09:15:10.034732 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 09:15:10 crc kubenswrapper[4846]: E1122 09:15:10.034952 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 09:15:10 crc kubenswrapper[4846]: E1122 09:15:10.035009 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 09:15:10 crc kubenswrapper[4846]: E1122 09:15:10.035161 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.125765 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:11Z","lastTransitionTime":"2025-11-22T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.227845 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.227903 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.227913 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.227927 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.227937 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:11Z","lastTransitionTime":"2025-11-22T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.330948 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.330994 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.331010 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.331029 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.331066 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:11Z","lastTransitionTime":"2025-11-22T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.434309 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.434372 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.434382 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.434399 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.434409 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:11Z","lastTransitionTime":"2025-11-22T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.537853 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.537902 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.537911 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.537925 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.537941 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:11Z","lastTransitionTime":"2025-11-22T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.640794 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.640844 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.640859 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.640878 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.640889 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:11Z","lastTransitionTime":"2025-11-22T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.743529 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.743666 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.743680 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.743699 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.743711 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:11Z","lastTransitionTime":"2025-11-22T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.846809 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.846854 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.846867 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.846886 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.846899 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:11Z","lastTransitionTime":"2025-11-22T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.949974 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.950014 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.950022 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.950078 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:11 crc kubenswrapper[4846]: I1122 09:15:11.950099 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:11Z","lastTransitionTime":"2025-11-22T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.034843 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.034875 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.034900 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.034853 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:12 crc kubenswrapper[4846]: E1122 09:15:12.034991 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:12 crc kubenswrapper[4846]: E1122 09:15:12.035098 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:12 crc kubenswrapper[4846]: E1122 09:15:12.035144 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:12 crc kubenswrapper[4846]: E1122 09:15:12.035181 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.053191 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.053232 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.053259 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.053276 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.053285 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.156560 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.156602 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.156612 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.156644 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.156656 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.260211 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.260271 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.260285 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.260314 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.260329 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.363302 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.363345 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.363356 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.363429 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.363443 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.465370 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.465416 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.465428 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.465449 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.465463 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.567745 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.567810 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.567827 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.567849 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.567864 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.671131 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.671197 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.671212 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.671234 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.671248 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.773970 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.774015 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.774071 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.774105 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.774118 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.877578 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.877634 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.877645 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.877666 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.877680 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.980574 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.980613 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.980622 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.980637 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:12 crc kubenswrapper[4846]: I1122 09:15:12.980648 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:12Z","lastTransitionTime":"2025-11-22T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.083365 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.083415 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.083427 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.083444 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.083454 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:13Z","lastTransitionTime":"2025-11-22T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.187388 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.187565 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.187581 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.187603 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.187618 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:13Z","lastTransitionTime":"2025-11-22T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.290163 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.290210 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.290287 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.290313 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.290332 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:13Z","lastTransitionTime":"2025-11-22T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.393230 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.393291 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.393303 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.393325 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.393339 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:13Z","lastTransitionTime":"2025-11-22T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.459229 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.459285 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.459298 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.459320 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.459336 4846 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T09:15:13Z","lastTransitionTime":"2025-11-22T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.534851 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66"] Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.535446 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.540494 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.540786 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.541173 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.543236 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.585258 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-twrxc" podStartSLOduration=67.585234146 podStartE2EDuration="1m7.585234146s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.563276299 +0000 UTC m=+88.498966008" watchObservedRunningTime="2025-11-22 09:15:13.585234146 +0000 UTC m=+88.520923795" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.657668 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6efa9458-2762-4573-9eee-2faa7c9719f9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.657807 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6efa9458-2762-4573-9eee-2faa7c9719f9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.657843 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6efa9458-2762-4573-9eee-2faa7c9719f9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.657911 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6efa9458-2762-4573-9eee-2faa7c9719f9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.657951 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6efa9458-2762-4573-9eee-2faa7c9719f9-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.689659 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hbcs8" podStartSLOduration=68.689634858 podStartE2EDuration="1m8.689634858s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.689400842 +0000 UTC m=+88.625090501" watchObservedRunningTime="2025-11-22 09:15:13.689634858 +0000 UTC m=+88.625324517" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.722736 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4h26m" podStartSLOduration=68.722713867 podStartE2EDuration="1m8.722713867s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.72033313 +0000 UTC m=+88.656022789" watchObservedRunningTime="2025-11-22 09:15:13.722713867 +0000 UTC m=+88.658403536" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.739372 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.739348645 podStartE2EDuration="1m8.739348645s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.738568783 +0000 UTC m=+88.674258432" watchObservedRunningTime="2025-11-22 09:15:13.739348645 +0000 UTC m=+88.675038304" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.758632 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6efa9458-2762-4573-9eee-2faa7c9719f9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.758701 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6efa9458-2762-4573-9eee-2faa7c9719f9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.758748 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6efa9458-2762-4573-9eee-2faa7c9719f9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.758831 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6efa9458-2762-4573-9eee-2faa7c9719f9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.758862 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6efa9458-2762-4573-9eee-2faa7c9719f9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.758927 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6efa9458-2762-4573-9eee-2faa7c9719f9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.758936 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6efa9458-2762-4573-9eee-2faa7c9719f9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.759723 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6efa9458-2762-4573-9eee-2faa7c9719f9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.768652 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6efa9458-2762-4573-9eee-2faa7c9719f9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.772033 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.772019312 podStartE2EDuration="35.772019312s" podCreationTimestamp="2025-11-22 09:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.752456603 +0000 UTC m=+88.688146262" watchObservedRunningTime="2025-11-22 09:15:13.772019312 +0000 UTC m=+88.707708961" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.779115 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6efa9458-2762-4573-9eee-2faa7c9719f9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7zw66\" (UID: \"6efa9458-2762-4573-9eee-2faa7c9719f9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.788939 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podStartSLOduration=68.788915947 podStartE2EDuration="1m8.788915947s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.788568417 +0000 UTC m=+88.724258066" watchObservedRunningTime="2025-11-22 09:15:13.788915947 +0000 UTC m=+88.724605606" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.805250 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-grx77" podStartSLOduration=68.805223175 podStartE2EDuration="1m8.805223175s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.803753883 +0000 UTC m=+88.739443532" watchObservedRunningTime="2025-11-22 09:15:13.805223175 +0000 UTC m=+88.740912824" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.854202 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-q52w8" podStartSLOduration=68.854174529 podStartE2EDuration="1m8.854174529s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.853380777 +0000 UTC m=+88.789070426" watchObservedRunningTime="2025-11-22 09:15:13.854174529 +0000 UTC m=+88.789864178" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.858989 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.927488 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=10.927465728 podStartE2EDuration="10.927465728s" podCreationTimestamp="2025-11-22 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.909221735 +0000 UTC m=+88.844911384" watchObservedRunningTime="2025-11-22 09:15:13.927465728 +0000 UTC m=+88.863155387" Nov 22 09:15:13 crc kubenswrapper[4846]: I1122 09:15:13.944030 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=62.944005072 podStartE2EDuration="1m2.944005072s" podCreationTimestamp="2025-11-22 09:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:13.928540278 +0000 UTC m=+88.864229927" watchObservedRunningTime="2025-11-22 09:15:13.944005072 +0000 UTC m=+88.879694711" Nov 22 09:15:14 crc kubenswrapper[4846]: I1122 09:15:14.034358 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:14 crc kubenswrapper[4846]: I1122 09:15:14.034362 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:14 crc kubenswrapper[4846]: E1122 09:15:14.034527 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
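[Annotation: in the pod_startup_latency_tracker entries above, podStartSLOduration is the interval from podCreationTimestamp to watchObservedRunningTime (pull times are the zero value here because no image pull was needed). A quick check of the etcd-crc figure under that reading, using only the values copied from the log:]

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the "+0000 UTC" timestamps in the log entries above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-11-22 09:15:03 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-11-22 09:15:13.927465728 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 10.927465728, matching podStartSLOduration for etcd-crc.
	fmt.Println(observed.Sub(created).Seconds())
}
```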
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:14 crc kubenswrapper[4846]: I1122 09:15:14.034398 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:14 crc kubenswrapper[4846]: I1122 09:15:14.034384 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:14 crc kubenswrapper[4846]: E1122 09:15:14.034733 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:14 crc kubenswrapper[4846]: E1122 09:15:14.034714 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:14 crc kubenswrapper[4846]: E1122 09:15:14.034813 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:14 crc kubenswrapper[4846]: I1122 09:15:14.048969 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 22 09:15:14 crc kubenswrapper[4846]: I1122 09:15:14.640357 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" event={"ID":"6efa9458-2762-4573-9eee-2faa7c9719f9","Type":"ContainerStarted","Data":"4fc40ef8e2729438f02840f397ab037b5cc874f0aa582f48e2245c059b67a1fa"} Nov 22 09:15:14 crc kubenswrapper[4846]: I1122 09:15:14.640434 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" event={"ID":"6efa9458-2762-4573-9eee-2faa7c9719f9","Type":"ContainerStarted","Data":"edc61acc2faad24472540ed3e8529796e0895cee0145cfa35d46e0e1e9404781"} Nov 22 09:15:14 crc kubenswrapper[4846]: I1122 09:15:14.660246 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.660218347 podStartE2EDuration="660.218347ms" podCreationTimestamp="2025-11-22 09:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:14.659821336 +0000 UTC m=+89.595510995" watchObservedRunningTime="2025-11-22 09:15:14.660218347 +0000 UTC m=+89.595908006" Nov 22 09:15:16 crc kubenswrapper[4846]: I1122 09:15:16.034875 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:16 crc kubenswrapper[4846]: E1122 09:15:16.035454 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:16 crc kubenswrapper[4846]: I1122 09:15:16.035500 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:16 crc kubenswrapper[4846]: I1122 09:15:16.035610 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:16 crc kubenswrapper[4846]: I1122 09:15:16.035641 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:16 crc kubenswrapper[4846]: E1122 09:15:16.043318 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:16 crc kubenswrapper[4846]: E1122 09:15:16.043766 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:16 crc kubenswrapper[4846]: E1122 09:15:16.044208 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:18 crc kubenswrapper[4846]: I1122 09:15:18.034407 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:18 crc kubenswrapper[4846]: I1122 09:15:18.034421 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:18 crc kubenswrapper[4846]: I1122 09:15:18.034683 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:18 crc kubenswrapper[4846]: I1122 09:15:18.034704 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:18 crc kubenswrapper[4846]: E1122 09:15:18.034841 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:18 crc kubenswrapper[4846]: E1122 09:15:18.034959 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:18 crc kubenswrapper[4846]: I1122 09:15:18.035031 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f" Nov 22 09:15:18 crc kubenswrapper[4846]: E1122 09:15:18.035025 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:18 crc kubenswrapper[4846]: E1122 09:15:18.035131 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:18 crc kubenswrapper[4846]: E1122 09:15:18.035200 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" Nov 22 09:15:20 crc kubenswrapper[4846]: I1122 09:15:20.034400 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:20 crc kubenswrapper[4846]: I1122 09:15:20.034397 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:20 crc kubenswrapper[4846]: I1122 09:15:20.034397 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:20 crc kubenswrapper[4846]: E1122 09:15:20.034590 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:20 crc kubenswrapper[4846]: I1122 09:15:20.034613 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:20 crc kubenswrapper[4846]: E1122 09:15:20.034696 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:20 crc kubenswrapper[4846]: E1122 09:15:20.034756 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:20 crc kubenswrapper[4846]: E1122 09:15:20.034853 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:22 crc kubenswrapper[4846]: I1122 09:15:22.035067 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:22 crc kubenswrapper[4846]: I1122 09:15:22.035085 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:22 crc kubenswrapper[4846]: I1122 09:15:22.035202 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:22 crc kubenswrapper[4846]: E1122 09:15:22.035340 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:22 crc kubenswrapper[4846]: I1122 09:15:22.035603 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:22 crc kubenswrapper[4846]: E1122 09:15:22.035699 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:22 crc kubenswrapper[4846]: E1122 09:15:22.035799 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:22 crc kubenswrapper[4846]: E1122 09:15:22.035910 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:23 crc kubenswrapper[4846]: I1122 09:15:23.777826 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:23 crc kubenswrapper[4846]: E1122 09:15:23.777980 4846 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:15:23 crc kubenswrapper[4846]: E1122 09:15:23.778080 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs podName:e79bf3c4-87ae-4009-9a11-d26130912fef nodeName:}" failed. No retries permitted until 2025-11-22 09:16:27.778029631 +0000 UTC m=+162.713719280 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs") pod "network-metrics-daemon-79xpm" (UID: "e79bf3c4-87ae-4009-9a11-d26130912fef") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 09:15:24 crc kubenswrapper[4846]: I1122 09:15:24.034418 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:24 crc kubenswrapper[4846]: I1122 09:15:24.034728 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:24 crc kubenswrapper[4846]: E1122 09:15:24.034741 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:24 crc kubenswrapper[4846]: I1122 09:15:24.034914 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:24 crc kubenswrapper[4846]: E1122 09:15:24.035112 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:24 crc kubenswrapper[4846]: I1122 09:15:24.035325 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:24 crc kubenswrapper[4846]: E1122 09:15:24.035334 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:24 crc kubenswrapper[4846]: E1122 09:15:24.035697 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:26 crc kubenswrapper[4846]: I1122 09:15:26.034395 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:26 crc kubenswrapper[4846]: I1122 09:15:26.034459 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:26 crc kubenswrapper[4846]: I1122 09:15:26.034511 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:26 crc kubenswrapper[4846]: I1122 09:15:26.034618 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:26 crc kubenswrapper[4846]: E1122 09:15:26.035573 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:26 crc kubenswrapper[4846]: E1122 09:15:26.036096 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:26 crc kubenswrapper[4846]: E1122 09:15:26.036208 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:26 crc kubenswrapper[4846]: E1122 09:15:26.035989 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:28 crc kubenswrapper[4846]: I1122 09:15:28.035165 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:28 crc kubenswrapper[4846]: I1122 09:15:28.035254 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:28 crc kubenswrapper[4846]: I1122 09:15:28.035334 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:28 crc kubenswrapper[4846]: E1122 09:15:28.035496 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:28 crc kubenswrapper[4846]: I1122 09:15:28.035576 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:28 crc kubenswrapper[4846]: E1122 09:15:28.035654 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:28 crc kubenswrapper[4846]: E1122 09:15:28.035732 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:28 crc kubenswrapper[4846]: E1122 09:15:28.035777 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:30 crc kubenswrapper[4846]: I1122 09:15:30.035147 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:30 crc kubenswrapper[4846]: I1122 09:15:30.035267 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:30 crc kubenswrapper[4846]: I1122 09:15:30.035395 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:30 crc kubenswrapper[4846]: E1122 09:15:30.035412 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:30 crc kubenswrapper[4846]: E1122 09:15:30.035560 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:30 crc kubenswrapper[4846]: E1122 09:15:30.035685 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:30 crc kubenswrapper[4846]: I1122 09:15:30.036402 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:30 crc kubenswrapper[4846]: E1122 09:15:30.036501 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:30 crc kubenswrapper[4846]: I1122 09:15:30.037037 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f" Nov 22 09:15:30 crc kubenswrapper[4846]: E1122 09:15:30.037439 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kws67_openshift-ovn-kubernetes(c874da16-5eda-477e-bbd5-e5c105dc7a07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" Nov 22 09:15:32 crc kubenswrapper[4846]: I1122 09:15:32.035149 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:32 crc kubenswrapper[4846]: I1122 09:15:32.035191 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:32 crc kubenswrapper[4846]: I1122 09:15:32.035170 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:32 crc kubenswrapper[4846]: E1122 09:15:32.035310 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:32 crc kubenswrapper[4846]: E1122 09:15:32.035457 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:32 crc kubenswrapper[4846]: I1122 09:15:32.035567 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:32 crc kubenswrapper[4846]: E1122 09:15:32.035609 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:32 crc kubenswrapper[4846]: E1122 09:15:32.035946 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:34 crc kubenswrapper[4846]: I1122 09:15:34.035030 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:34 crc kubenswrapper[4846]: I1122 09:15:34.035179 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:34 crc kubenswrapper[4846]: I1122 09:15:34.035220 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:34 crc kubenswrapper[4846]: I1122 09:15:34.035231 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:34 crc kubenswrapper[4846]: E1122 09:15:34.035743 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:34 crc kubenswrapper[4846]: E1122 09:15:34.036111 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:34 crc kubenswrapper[4846]: E1122 09:15:34.036030 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:34 crc kubenswrapper[4846]: E1122 09:15:34.036217 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:36 crc kubenswrapper[4846]: I1122 09:15:36.034979 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:36 crc kubenswrapper[4846]: I1122 09:15:36.034978 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:36 crc kubenswrapper[4846]: I1122 09:15:36.035145 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:36 crc kubenswrapper[4846]: I1122 09:15:36.036228 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:36 crc kubenswrapper[4846]: E1122 09:15:36.036414 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:36 crc kubenswrapper[4846]: E1122 09:15:36.036541 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:36 crc kubenswrapper[4846]: E1122 09:15:36.036619 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:36 crc kubenswrapper[4846]: E1122 09:15:36.036697 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:38 crc kubenswrapper[4846]: I1122 09:15:38.034533 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:38 crc kubenswrapper[4846]: I1122 09:15:38.035240 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:38 crc kubenswrapper[4846]: I1122 09:15:38.035334 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:38 crc kubenswrapper[4846]: I1122 09:15:38.035388 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:38 crc kubenswrapper[4846]: E1122 09:15:38.036180 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:38 crc kubenswrapper[4846]: E1122 09:15:38.036334 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:38 crc kubenswrapper[4846]: E1122 09:15:38.036588 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:38 crc kubenswrapper[4846]: E1122 09:15:38.036784 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:40 crc kubenswrapper[4846]: I1122 09:15:40.035028 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:40 crc kubenswrapper[4846]: I1122 09:15:40.035082 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:40 crc kubenswrapper[4846]: I1122 09:15:40.035187 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:40 crc kubenswrapper[4846]: E1122 09:15:40.035332 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:40 crc kubenswrapper[4846]: I1122 09:15:40.035380 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:40 crc kubenswrapper[4846]: E1122 09:15:40.035645 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:40 crc kubenswrapper[4846]: E1122 09:15:40.035683 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:40 crc kubenswrapper[4846]: E1122 09:15:40.035585 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.034686 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.034845 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:42 crc kubenswrapper[4846]: E1122 09:15:42.034892 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:42 crc kubenswrapper[4846]: E1122 09:15:42.035036 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.035431 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:42 crc kubenswrapper[4846]: E1122 09:15:42.035676 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.035888 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:42 crc kubenswrapper[4846]: E1122 09:15:42.036170 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.741601 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/1.log" Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.742294 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/0.log" Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.742367 4846 generic.go:334] "Generic (PLEG): container finished" podID="9aec6a38-e6e4-4009-95d2-6a179c7fac04" containerID="8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a" exitCode=1 Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.742432 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbcs8" event={"ID":"9aec6a38-e6e4-4009-95d2-6a179c7fac04","Type":"ContainerDied","Data":"8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a"} Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.742499 4846 scope.go:117] "RemoveContainer" containerID="0c7a847cc7da2c436db81df48b8fc0c56a5e8b4ceab02ae54e04ef663f3e93fd" Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.743352 4846 scope.go:117] "RemoveContainer" containerID="8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a" Nov 22 09:15:42 crc kubenswrapper[4846]: E1122 09:15:42.743635 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-hbcs8_openshift-multus(9aec6a38-e6e4-4009-95d2-6a179c7fac04)\"" pod="openshift-multus/multus-hbcs8" podUID="9aec6a38-e6e4-4009-95d2-6a179c7fac04" Nov 22 09:15:42 crc kubenswrapper[4846]: I1122 09:15:42.768751 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7zw66" podStartSLOduration=97.768711645 podStartE2EDuration="1m37.768711645s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:14.681526465 +0000 UTC m=+89.617216114" watchObservedRunningTime="2025-11-22 09:15:42.768711645 +0000 UTC m=+117.704401324" Nov 22 09:15:43 crc kubenswrapper[4846]: I1122 09:15:43.748405 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/1.log" Nov 22 09:15:44 crc kubenswrapper[4846]: I1122 09:15:44.034326 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:44 crc kubenswrapper[4846]: I1122 09:15:44.034815 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:44 crc kubenswrapper[4846]: I1122 09:15:44.034355 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:44 crc kubenswrapper[4846]: E1122 09:15:44.035006 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:44 crc kubenswrapper[4846]: E1122 09:15:44.035075 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:44 crc kubenswrapper[4846]: I1122 09:15:44.034507 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:44 crc kubenswrapper[4846]: E1122 09:15:44.035185 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:44 crc kubenswrapper[4846]: I1122 09:15:44.035263 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f" Nov 22 09:15:44 crc kubenswrapper[4846]: E1122 09:15:44.035422 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:44 crc kubenswrapper[4846]: I1122 09:15:44.754738 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/3.log" Nov 22 09:15:44 crc kubenswrapper[4846]: I1122 09:15:44.759120 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerStarted","Data":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"} Nov 22 09:15:44 crc kubenswrapper[4846]: I1122 09:15:44.759733 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:15:44 crc kubenswrapper[4846]: I1122 09:15:44.797416 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podStartSLOduration=99.79739066 podStartE2EDuration="1m39.79739066s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:15:44.795283971 +0000 UTC m=+119.730973630" watchObservedRunningTime="2025-11-22 09:15:44.79739066 +0000 UTC m=+119.733080329" Nov 22 09:15:45 crc kubenswrapper[4846]: I1122 09:15:45.167025 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-79xpm"] Nov 22 09:15:45 crc kubenswrapper[4846]: I1122 09:15:45.167189 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:45 crc kubenswrapper[4846]: E1122 09:15:45.167307 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:45 crc kubenswrapper[4846]: E1122 09:15:45.999771 4846 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 22 09:15:46 crc kubenswrapper[4846]: I1122 09:15:46.034513 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:46 crc kubenswrapper[4846]: I1122 09:15:46.034631 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:46 crc kubenswrapper[4846]: E1122 09:15:46.035632 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:46 crc kubenswrapper[4846]: I1122 09:15:46.035735 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:46 crc kubenswrapper[4846]: E1122 09:15:46.035868 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:46 crc kubenswrapper[4846]: E1122 09:15:46.035991 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:46 crc kubenswrapper[4846]: E1122 09:15:46.184291 4846 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 09:15:47 crc kubenswrapper[4846]: I1122 09:15:47.035284 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:47 crc kubenswrapper[4846]: E1122 09:15:47.035507 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:48 crc kubenswrapper[4846]: I1122 09:15:48.034262 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:48 crc kubenswrapper[4846]: I1122 09:15:48.034417 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:48 crc kubenswrapper[4846]: I1122 09:15:48.034541 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:48 crc kubenswrapper[4846]: E1122 09:15:48.034780 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:48 crc kubenswrapper[4846]: E1122 09:15:48.034925 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:48 crc kubenswrapper[4846]: E1122 09:15:48.035210 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:49 crc kubenswrapper[4846]: I1122 09:15:49.034210 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:49 crc kubenswrapper[4846]: E1122 09:15:49.034371 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:50 crc kubenswrapper[4846]: I1122 09:15:50.034430 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:50 crc kubenswrapper[4846]: I1122 09:15:50.034462 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:50 crc kubenswrapper[4846]: I1122 09:15:50.034518 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:50 crc kubenswrapper[4846]: E1122 09:15:50.036185 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:50 crc kubenswrapper[4846]: E1122 09:15:50.036605 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:50 crc kubenswrapper[4846]: E1122 09:15:50.036524 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:51 crc kubenswrapper[4846]: I1122 09:15:51.035132 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:51 crc kubenswrapper[4846]: E1122 09:15:51.035610 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:51 crc kubenswrapper[4846]: E1122 09:15:51.186138 4846 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 09:15:52 crc kubenswrapper[4846]: I1122 09:15:52.034619 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:52 crc kubenswrapper[4846]: I1122 09:15:52.034728 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:52 crc kubenswrapper[4846]: E1122 09:15:52.034819 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:52 crc kubenswrapper[4846]: E1122 09:15:52.034920 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:52 crc kubenswrapper[4846]: I1122 09:15:52.035114 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:52 crc kubenswrapper[4846]: E1122 09:15:52.035331 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:53 crc kubenswrapper[4846]: I1122 09:15:53.035001 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:53 crc kubenswrapper[4846]: E1122 09:15:53.035681 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:54 crc kubenswrapper[4846]: I1122 09:15:54.034395 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:54 crc kubenswrapper[4846]: I1122 09:15:54.034481 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:54 crc kubenswrapper[4846]: I1122 09:15:54.034387 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:54 crc kubenswrapper[4846]: E1122 09:15:54.034624 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:54 crc kubenswrapper[4846]: E1122 09:15:54.034788 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:54 crc kubenswrapper[4846]: E1122 09:15:54.035078 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:55 crc kubenswrapper[4846]: I1122 09:15:55.034622 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:55 crc kubenswrapper[4846]: E1122 09:15:55.035011 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:55 crc kubenswrapper[4846]: I1122 09:15:55.035616 4846 scope.go:117] "RemoveContainer" containerID="8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a" Nov 22 09:15:55 crc kubenswrapper[4846]: I1122 09:15:55.805360 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/1.log" Nov 22 09:15:55 crc kubenswrapper[4846]: I1122 09:15:55.805433 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbcs8" event={"ID":"9aec6a38-e6e4-4009-95d2-6a179c7fac04","Type":"ContainerStarted","Data":"2c9ecafae6b69b17dbedcb7f5d9e0c34ac261a6452f0276112bc86f9662471e7"} Nov 22 09:15:56 crc kubenswrapper[4846]: I1122 09:15:56.034744 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:56 crc kubenswrapper[4846]: I1122 09:15:56.034806 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:56 crc kubenswrapper[4846]: I1122 09:15:56.034920 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:56 crc kubenswrapper[4846]: E1122 09:15:56.036870 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:56 crc kubenswrapper[4846]: E1122 09:15:56.037157 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:56 crc kubenswrapper[4846]: E1122 09:15:56.037277 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:56 crc kubenswrapper[4846]: E1122 09:15:56.187116 4846 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 09:15:57 crc kubenswrapper[4846]: I1122 09:15:57.034423 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:57 crc kubenswrapper[4846]: E1122 09:15:57.034572 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:15:58 crc kubenswrapper[4846]: I1122 09:15:58.034660 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:15:58 crc kubenswrapper[4846]: I1122 09:15:58.034752 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:15:58 crc kubenswrapper[4846]: I1122 09:15:58.034676 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:15:58 crc kubenswrapper[4846]: E1122 09:15:58.034886 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:15:58 crc kubenswrapper[4846]: E1122 09:15:58.035086 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:15:58 crc kubenswrapper[4846]: E1122 09:15:58.035199 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:15:59 crc kubenswrapper[4846]: I1122 09:15:59.034339 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:15:59 crc kubenswrapper[4846]: E1122 09:15:59.034620 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:16:00 crc kubenswrapper[4846]: I1122 09:16:00.034778 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:16:00 crc kubenswrapper[4846]: I1122 09:16:00.034778 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:16:00 crc kubenswrapper[4846]: E1122 09:16:00.035249 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 09:16:00 crc kubenswrapper[4846]: I1122 09:16:00.035329 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:16:00 crc kubenswrapper[4846]: E1122 09:16:00.035855 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 09:16:00 crc kubenswrapper[4846]: E1122 09:16:00.036354 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 09:16:01 crc kubenswrapper[4846]: I1122 09:16:01.034340 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:16:01 crc kubenswrapper[4846]: E1122 09:16:01.034535 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-79xpm" podUID="e79bf3c4-87ae-4009-9a11-d26130912fef" Nov 22 09:16:02 crc kubenswrapper[4846]: I1122 09:16:02.034831 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:16:02 crc kubenswrapper[4846]: I1122 09:16:02.034912 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:16:02 crc kubenswrapper[4846]: I1122 09:16:02.035006 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:16:02 crc kubenswrapper[4846]: I1122 09:16:02.038481 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 22 09:16:02 crc kubenswrapper[4846]: I1122 09:16:02.038841 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 22 09:16:02 crc kubenswrapper[4846]: I1122 09:16:02.038925 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 22 09:16:02 crc kubenswrapper[4846]: I1122 09:16:02.039503 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 22 09:16:03 crc kubenswrapper[4846]: I1122 09:16:03.034245 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm" Nov 22 09:16:03 crc kubenswrapper[4846]: I1122 09:16:03.037327 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 22 09:16:03 crc kubenswrapper[4846]: I1122 09:16:03.037421 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.272115 4846 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.318463 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.319035 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2fp62"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.319272 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.320724 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.321801 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.321987 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.324226 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.324814 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xxpch"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.325319 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.325683 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.326664 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.326899 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.326997 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.327070 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zdpb4"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.327383 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.327608 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.327947 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.330400 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wgnrq"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.331276 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-n2tpg"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.331458 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.332027 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.332158 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.332375 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.332523 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.333254 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.333349 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.333435 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.334021 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.334198 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.334330 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.339761 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.340744 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.341139 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.343121 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.344133 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v92tx"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.344571 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.350031 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.354003 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.380082 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.380298 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.380681 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.380911 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.381111 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.381242 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.381335 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.381682 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.381957 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.382538 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.383111 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.383335 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-k86mj"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.383405 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.384380 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.384943 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.384993 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385039 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385083 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385151 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385179 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385220 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385238 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385308 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385315 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385319 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385387 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385456 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.384994 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.385974 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.387492 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.389480 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.392057 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.392251 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.392447 
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.392634 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.396671 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.396932 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.397871 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.398061 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.398234 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.398392 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.398783 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.399163 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.403132 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.403668 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.404528 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.407347 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.407605 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.408499 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx"]
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.408997 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-n5xpn"]
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.409298 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lz8p8"]
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.409721 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.410282 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.410487 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-n5xpn"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.416004 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.416117 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.416377 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.416646 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-49tss"]
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.416835 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.417283 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f"]
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.417820 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.419133 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tzhcp"]
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.419614 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tzhcp"
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.417291 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-49tss"
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.419706 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.420312 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.420744 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.421225 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp6pf\" (UniqueName: \"kubernetes.io/projected/98d61b3e-6191-4d14-823a-f791ddf65cae-kube-api-access-dp6pf\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.421256 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.421276 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.421297 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.421318 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-trusted-ca\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432335 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-etcd-client\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432447 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-serving-cert\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432500 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432549 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20741111-12b6-4d66-9743-c51d0b8a1a5b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zgtrh\" (UID: \"20741111-12b6-4d66-9743-c51d0b8a1a5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432581 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432624 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b6cce2-f501-46bb-af41-3933baf3205c-config\") pod \"kube-controller-manager-operator-78b949d7b-l7j2f\" (UID: \"e4b6cce2-f501-46bb-af41-3933baf3205c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432668 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432721 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-config\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432764 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432795 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432828 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5e681f-ca95-4ba0-935e-86f18702cf78-serving-cert\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432921 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23993c93-7c44-4d7c-8758-1cf3666212a5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432960 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-audit\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.432992 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433024 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-encryption-config\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433086 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8mlh\" (UniqueName: \"kubernetes.io/projected/c9204723-54c5-457c-8bb8-58be85f199e2-kube-api-access-f8mlh\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433284 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-service-ca\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433329 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f0f79231-ace1-41b1-ae23-f812f404fb67-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-szfmb\" (UID: \"f0f79231-ace1-41b1-ae23-f812f404fb67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433362 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghck6\" (UniqueName: \"kubernetes.io/projected/f0f79231-ace1-41b1-ae23-f812f404fb67-kube-api-access-ghck6\") pod \"openshift-apiserver-operator-796bbdcf4f-szfmb\" (UID: \"f0f79231-ace1-41b1-ae23-f812f404fb67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433531 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d61b3e-6191-4d14-823a-f791ddf65cae-service-ca-bundle\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433568 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgzmt\" (UniqueName: \"kubernetes.io/projected/23993c93-7c44-4d7c-8758-1cf3666212a5-kube-api-access-mgzmt\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433665 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22bb9a30-7380-4482-b556-57bed8a7d681-serving-cert\") pod \"openshift-config-operator-7777fb866f-mzg5p\" (UID: \"22bb9a30-7380-4482-b556-57bed8a7d681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433717 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfvd\" (UniqueName: \"kubernetes.io/projected/4b0be692-d108-4051-9a33-6529b4ed1e7b-kube-api-access-wlfvd\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433745 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kv8b\" (UniqueName: \"kubernetes.io/projected/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-kube-api-access-6kv8b\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433830 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d61b3e-6191-4d14-823a-f791ddf65cae-config\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.433894 4846 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-client-ca\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.434021 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k626n\" (UniqueName: \"kubernetes.io/projected/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-kube-api-access-k626n\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.434187 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-config\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.434234 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-auth-proxy-config\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.434545 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ttp4\" (UniqueName: \"kubernetes.io/projected/09e9b382-4c2b-440b-978b-3aab0494d892-kube-api-access-9ttp4\") pod \"dns-operator-744455d44c-n2tpg\" (UID: \"09e9b382-4c2b-440b-978b-3aab0494d892\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.434591 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-oauth-serving-cert\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.434696 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-serving-cert\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.434798 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23993c93-7c44-4d7c-8758-1cf3666212a5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.434841 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.434878 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfmdj\" (UniqueName: \"kubernetes.io/projected/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-kube-api-access-cfmdj\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.435085 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.435262 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-etcd-client\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.435342 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-serving-cert\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.435585 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-audit-policies\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.435739 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8418012a-eb36-472f-a4ea-49d3af8dbd09-config\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.435801 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-node-pullsecrets\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.435998 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-encryption-config\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.436135 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/22bb9a30-7380-4482-b556-57bed8a7d681-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mzg5p\" (UID: \"22bb9a30-7380-4482-b556-57bed8a7d681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.437991 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-oauth-config\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.438072 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-audit-dir\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.438192 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8418012a-eb36-472f-a4ea-49d3af8dbd09-etcd-client\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.438358 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-image-import-ca\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.438825 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8418012a-eb36-472f-a4ea-49d3af8dbd09-serving-cert\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.439267 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4b6cce2-f501-46bb-af41-3933baf3205c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l7j2f\" (UID: \"e4b6cce2-f501-46bb-af41-3933baf3205c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.439493 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09e9b382-4c2b-440b-978b-3aab0494d892-metrics-tls\") pod \"dns-operator-744455d44c-n2tpg\" (UID: 
\"09e9b382-4c2b-440b-978b-3aab0494d892\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.439620 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-dir\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.440088 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d61b3e-6191-4d14-823a-f791ddf65cae-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.440375 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-772mk\" (UniqueName: \"kubernetes.io/projected/8418012a-eb36-472f-a4ea-49d3af8dbd09-kube-api-access-772mk\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.441488 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.440855 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87xqz\" (UniqueName: \"kubernetes.io/projected/22bb9a30-7380-4482-b556-57bed8a7d681-kube-api-access-87xqz\") pod \"openshift-config-operator-7777fb866f-mzg5p\" (UID: \"22bb9a30-7380-4482-b556-57bed8a7d681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.441681 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.441736 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-config\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.441774 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.441796 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sddbs\" (UniqueName: \"kubernetes.io/projected/fd5e681f-ca95-4ba0-935e-86f18702cf78-kube-api-access-sddbs\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.441905 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8418012a-eb36-472f-a4ea-49d3af8dbd09-etcd-ca\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.441971 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp59c\" (UniqueName: \"kubernetes.io/projected/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-kube-api-access-qp59c\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.442029 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.442254 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.442638 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.444165 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.457410 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls6wc\" (UniqueName: \"kubernetes.io/projected/20741111-12b6-4d66-9743-c51d0b8a1a5b-kube-api-access-ls6wc\") pod \"openshift-controller-manager-operator-756b6f6bc6-zgtrh\" (UID: \"20741111-12b6-4d66-9743-c51d0b8a1a5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.457535 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-config\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.463132 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.464320 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.464404 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.464563 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 22 09:16:04 crc 
kubenswrapper[4846]: I1122 09:16:04.464688 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.464759 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.464938 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465066 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465127 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465292 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465349 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465487 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465536 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465367 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465696 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465755 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.465293 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.466497 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.466627 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.466774 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.467107 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.467277 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.474726 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" 
Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.475189 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.475219 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.475346 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.475452 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.475511 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.475397 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.475682 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.475948 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476383 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476433 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-config\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476553 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d61b3e-6191-4d14-823a-f791ddf65cae-serving-cert\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476614 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-config\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476650 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/23993c93-7c44-4d7c-8758-1cf3666212a5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476709 4846 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476743 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-images\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476769 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gtnb\" (UniqueName: \"kubernetes.io/projected/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-kube-api-access-7gtnb\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476846 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-machine-approver-tls\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476853 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.476877 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478159 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-etcd-serving-ca\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478202 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-console-config\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478238 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-client-ca\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478300 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b6cce2-f501-46bb-af41-3933baf3205c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l7j2f\" (UID: \"e4b6cce2-f501-46bb-af41-3933baf3205c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478343 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrmh9\" (UniqueName: \"kubernetes.io/projected/f23592b0-b045-4aa5-a22f-c15133890ed4-kube-api-access-qrmh9\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478414 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478441 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-audit-dir\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478468 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24qnq\" (UniqueName: \"kubernetes.io/projected/6cc4154d-473a-46cf-acf2-6978d0e642ee-kube-api-access-24qnq\") pod \"downloads-7954f5f757-n5xpn\" (UID: \"6cc4154d-473a-46cf-acf2-6978d0e642ee\") " pod="openshift-console/downloads-7954f5f757-n5xpn" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 
09:16:04.478371 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478680 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-trusted-ca-bundle\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478734 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f79231-ace1-41b1-ae23-f812f404fb67-config\") pod \"openshift-apiserver-operator-796bbdcf4f-szfmb\" (UID: \"f0f79231-ace1-41b1-ae23-f812f404fb67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.478805 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9204723-54c5-457c-8bb8-58be85f199e2-serving-cert\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.479012 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.479086 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ee64233-aae3-4a4a-815d-ca55dd93bfb4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4xjfk\" (UID: \"9ee64233-aae3-4a4a-815d-ca55dd93bfb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.479109 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcxd2\" (UniqueName: \"kubernetes.io/projected/9ee64233-aae3-4a4a-815d-ca55dd93bfb4-kube-api-access-rcxd2\") pod \"cluster-samples-operator-665b6dd947-4xjfk\" (UID: \"9ee64233-aae3-4a4a-815d-ca55dd93bfb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.479144 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-policies\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.479165 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8418012a-eb36-472f-a4ea-49d3af8dbd09-etcd-service-ca\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.479198 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20741111-12b6-4d66-9743-c51d0b8a1a5b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zgtrh\" (UID: \"20741111-12b6-4d66-9743-c51d0b8a1a5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.479218 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-serving-cert\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.479218 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.484578 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nfrf8"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.485538 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.486528 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.486932 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.487283 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.487443 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.496153 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.497145 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hmwp7"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.497434 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.498323 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.498812 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6kzvm"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.499426 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.499652 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.499854 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.511395 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.512280 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.520648 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.521738 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.526215 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.527037 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lgbpn"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.527578 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.527924 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.528974 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.529690 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-slgpd"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.529966 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.531325 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.531825 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.532037 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.532467 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.533512 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.535357 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.536360 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.536903 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.537255 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.537556 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mhwdb"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.538608 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.538772 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.542547 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xxpch"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.542607 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.542617 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-k86mj"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.553818 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nfrf8"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.555984 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.556363 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.556932 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.561719 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.567646 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.570599 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zdpb4"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.571563 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2fp62"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.574657 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wgnrq"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.577560 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.577974 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.578180 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.578211 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580131 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d61b3e-6191-4d14-823a-f791ddf65cae-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580179 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-772mk\" (UniqueName: \"kubernetes.io/projected/8418012a-eb36-472f-a4ea-49d3af8dbd09-kube-api-access-772mk\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580202 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87xqz\" (UniqueName: \"kubernetes.io/projected/22bb9a30-7380-4482-b556-57bed8a7d681-kube-api-access-87xqz\") pod \"openshift-config-operator-7777fb866f-mzg5p\" (UID: \"22bb9a30-7380-4482-b556-57bed8a7d681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580222 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580247 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-config\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580268 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580288 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sddbs\" (UniqueName: \"kubernetes.io/projected/fd5e681f-ca95-4ba0-935e-86f18702cf78-kube-api-access-sddbs\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580310 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8418012a-eb36-472f-a4ea-49d3af8dbd09-etcd-ca\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580334 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp59c\" (UniqueName: \"kubernetes.io/projected/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-kube-api-access-qp59c\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580352 4846 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ls6wc\" (UniqueName: \"kubernetes.io/projected/20741111-12b6-4d66-9743-c51d0b8a1a5b-kube-api-access-ls6wc\") pod \"openshift-controller-manager-operator-756b6f6bc6-zgtrh\" (UID: \"20741111-12b6-4d66-9743-c51d0b8a1a5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580376 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-config\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580397 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-config\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580416 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d61b3e-6191-4d14-823a-f791ddf65cae-serving-cert\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580436 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-config\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580454 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/23993c93-7c44-4d7c-8758-1cf3666212a5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580474 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580497 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-images\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.580518 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gtnb\" (UniqueName: 
\"kubernetes.io/projected/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-kube-api-access-7gtnb\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581180 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-machine-approver-tls\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581464 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-etcd-serving-ca\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581488 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-console-config\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581511 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-client-ca\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581543 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b6cce2-f501-46bb-af41-3933baf3205c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l7j2f\" (UID: \"e4b6cce2-f501-46bb-af41-3933baf3205c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581578 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrmh9\" (UniqueName: \"kubernetes.io/projected/f23592b0-b045-4aa5-a22f-c15133890ed4-kube-api-access-qrmh9\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581610 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581635 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-audit-dir\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc 
kubenswrapper[4846]: I1122 09:16:04.581660 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24qnq\" (UniqueName: \"kubernetes.io/projected/6cc4154d-473a-46cf-acf2-6978d0e642ee-kube-api-access-24qnq\") pod \"downloads-7954f5f757-n5xpn\" (UID: \"6cc4154d-473a-46cf-acf2-6978d0e642ee\") " pod="openshift-console/downloads-7954f5f757-n5xpn" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581698 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-trusted-ca-bundle\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581720 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f79231-ace1-41b1-ae23-f812f404fb67-config\") pod \"openshift-apiserver-operator-796bbdcf4f-szfmb\" (UID: \"f0f79231-ace1-41b1-ae23-f812f404fb67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581755 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9204723-54c5-457c-8bb8-58be85f199e2-serving-cert\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581783 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581808 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ee64233-aae3-4a4a-815d-ca55dd93bfb4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4xjfk\" (UID: \"9ee64233-aae3-4a4a-815d-ca55dd93bfb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581832 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcxd2\" (UniqueName: \"kubernetes.io/projected/9ee64233-aae3-4a4a-815d-ca55dd93bfb4-kube-api-access-rcxd2\") pod \"cluster-samples-operator-665b6dd947-4xjfk\" (UID: \"9ee64233-aae3-4a4a-815d-ca55dd93bfb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.581859 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-policies\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.582003 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8418012a-eb36-472f-a4ea-49d3af8dbd09-etcd-service-ca\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.582035 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20741111-12b6-4d66-9743-c51d0b8a1a5b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zgtrh\" (UID: \"20741111-12b6-4d66-9743-c51d0b8a1a5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.582101 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-serving-cert\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.582125 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp6pf\" (UniqueName: \"kubernetes.io/projected/98d61b3e-6191-4d14-823a-f791ddf65cae-kube-api-access-dp6pf\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.582233 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.582264 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583024 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583129 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-trusted-ca\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583172 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-etcd-client\") pod 
\"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583224 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-serving-cert\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583256 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583286 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20741111-12b6-4d66-9743-c51d0b8a1a5b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zgtrh\" (UID: \"20741111-12b6-4d66-9743-c51d0b8a1a5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583316 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583342 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b6cce2-f501-46bb-af41-3933baf3205c-config\") pod \"kube-controller-manager-operator-78b949d7b-l7j2f\" (UID: \"e4b6cce2-f501-46bb-af41-3933baf3205c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583365 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583386 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-config\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583412 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583422 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-images\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583433 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583515 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5e681f-ca95-4ba0-935e-86f18702cf78-serving-cert\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583575 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23993c93-7c44-4d7c-8758-1cf3666212a5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583597 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-audit\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583658 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583690 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-encryption-config\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583753 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8mlh\" (UniqueName: \"kubernetes.io/projected/c9204723-54c5-457c-8bb8-58be85f199e2-kube-api-access-f8mlh\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583895 4846 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-service-ca\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583917 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f79231-ace1-41b1-ae23-f812f404fb67-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-szfmb\" (UID: \"f0f79231-ace1-41b1-ae23-f812f404fb67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.583950 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghck6\" (UniqueName: \"kubernetes.io/projected/f0f79231-ace1-41b1-ae23-f812f404fb67-kube-api-access-ghck6\") pod \"openshift-apiserver-operator-796bbdcf4f-szfmb\" (UID: \"f0f79231-ace1-41b1-ae23-f812f404fb67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.584133 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d61b3e-6191-4d14-823a-f791ddf65cae-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585182 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d61b3e-6191-4d14-823a-f791ddf65cae-service-ca-bundle\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585300 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgzmt\" (UniqueName: \"kubernetes.io/projected/23993c93-7c44-4d7c-8758-1cf3666212a5-kube-api-access-mgzmt\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585329 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22bb9a30-7380-4482-b556-57bed8a7d681-serving-cert\") pod \"openshift-config-operator-7777fb866f-mzg5p\" (UID: \"22bb9a30-7380-4482-b556-57bed8a7d681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585382 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfvd\" (UniqueName: \"kubernetes.io/projected/4b0be692-d108-4051-9a33-6529b4ed1e7b-kube-api-access-wlfvd\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585423 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kv8b\" (UniqueName: 
\"kubernetes.io/projected/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-kube-api-access-6kv8b\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585486 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d61b3e-6191-4d14-823a-f791ddf65cae-config\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585514 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-client-ca\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585546 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k626n\" (UniqueName: \"kubernetes.io/projected/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-kube-api-access-k626n\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585615 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-config\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585655 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-auth-proxy-config\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585685 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ttp4\" (UniqueName: \"kubernetes.io/projected/09e9b382-4c2b-440b-978b-3aab0494d892-kube-api-access-9ttp4\") pod \"dns-operator-744455d44c-n2tpg\" (UID: \"09e9b382-4c2b-440b-978b-3aab0494d892\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585715 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-oauth-serving-cert\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585764 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-serving-cert\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 
09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585795 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23993c93-7c44-4d7c-8758-1cf3666212a5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585821 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585866 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfmdj\" (UniqueName: \"kubernetes.io/projected/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-kube-api-access-cfmdj\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585893 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585920 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-etcd-client\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585968 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-serving-cert\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585993 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-audit-policies\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.585983 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d61b3e-6191-4d14-823a-f791ddf65cae-service-ca-bundle\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586032 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8418012a-eb36-472f-a4ea-49d3af8dbd09-config\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586065 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-config\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586075 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-node-pullsecrets\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586111 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-encryption-config\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586145 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/22bb9a30-7380-4482-b556-57bed8a7d681-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mzg5p\" (UID: \"22bb9a30-7380-4482-b556-57bed8a7d681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586171 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-oauth-config\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586196 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-audit-dir\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586219 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8418012a-eb36-472f-a4ea-49d3af8dbd09-etcd-client\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586629 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.586993 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-image-import-ca\") pod \"apiserver-76f77b778f-wgnrq\" (UID: 
\"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.587433 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxzvp\" (UniqueName: \"kubernetes.io/projected/9f953d76-d324-4923-8767-534c7fec6648-kube-api-access-sxzvp\") pod \"migrator-59844c95c7-268n8\" (UID: \"9f953d76-d324-4923-8767-534c7fec6648\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.587563 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8418012a-eb36-472f-a4ea-49d3af8dbd09-serving-cert\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.594443 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-policies\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.595708 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-trusted-ca\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.595940 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-config\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.596332 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.596778 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.599761 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.603884 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.604498 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d61b3e-6191-4d14-823a-f791ddf65cae-config\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.604637 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20741111-12b6-4d66-9743-c51d0b8a1a5b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zgtrh\" (UID: \"20741111-12b6-4d66-9743-c51d0b8a1a5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.605223 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-client-ca\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.605470 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-config\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.606132 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-auth-proxy-config\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.606871 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-oauth-serving-cert\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.610353 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.611013 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-serving-cert\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.616670 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-etcd-client\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.617117 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-serving-cert\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.617235 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-node-pullsecrets\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.617558 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-audit-policies\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.618709 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/22bb9a30-7380-4482-b556-57bed8a7d681-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mzg5p\" (UID: \"22bb9a30-7380-4482-b556-57bed8a7d681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.618967 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-audit-dir\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.619407 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.620019 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-image-import-ca\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.619884 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-config\") pod \"machine-approver-56656f9798-g4frl\" (UID: 
\"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.621031 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-config\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.621184 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23993c93-7c44-4d7c-8758-1cf3666212a5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.621248 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.621799 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.622839 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-audit\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.623081 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-config\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.623075 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-serving-cert\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.623464 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.624268 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-etcd-serving-ca\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc 
kubenswrapper[4846]: I1122 09:16:04.624276 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-service-ca\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.625276 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-client-ca\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.625953 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0f79231-ace1-41b1-ae23-f812f404fb67-config\") pod \"openshift-apiserver-operator-796bbdcf4f-szfmb\" (UID: \"f0f79231-ace1-41b1-ae23-f812f404fb67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.626773 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-encryption-config\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.627350 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-console-config\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.627432 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-audit-dir\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.627605 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-trusted-ca-bundle\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.627692 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.627849 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v92tx"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.628729 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20741111-12b6-4d66-9743-c51d0b8a1a5b-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-zgtrh\" (UID: \"20741111-12b6-4d66-9743-c51d0b8a1a5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.631886 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d61b3e-6191-4d14-823a-f791ddf65cae-serving-cert\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.632819 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.632916 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.633030 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: \"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.588089 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4b6cce2-f501-46bb-af41-3933baf3205c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l7j2f\" (UID: \"e4b6cce2-f501-46bb-af41-3933baf3205c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.633128 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09e9b382-4c2b-440b-978b-3aab0494d892-metrics-tls\") pod \"dns-operator-744455d44c-n2tpg\" (UID: \"09e9b382-4c2b-440b-978b-3aab0494d892\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.633154 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-dir\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.633289 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-dir\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.634580 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.634580 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.634693 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.635089 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.635973 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22bb9a30-7380-4482-b556-57bed8a7d681-serving-cert\") pod \"openshift-config-operator-7777fb866f-mzg5p\" (UID: \"22bb9a30-7380-4482-b556-57bed8a7d681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.636094 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-oauth-config\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.636728 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09e9b382-4c2b-440b-978b-3aab0494d892-metrics-tls\") pod \"dns-operator-744455d44c-n2tpg\" (UID: \"09e9b382-4c2b-440b-978b-3aab0494d892\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.637313 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-serving-cert\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.642409 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.643871 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.645097 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wd5kx"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.646680 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wd5kx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.647142 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-nrm4s"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.647652 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b6cce2-f501-46bb-af41-3933baf3205c-config\") pod \"kube-controller-manager-operator-78b949d7b-l7j2f\" (UID: \"e4b6cce2-f501-46bb-af41-3933baf3205c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.648461 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.648598 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.649253 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8418012a-eb36-472f-a4ea-49d3af8dbd09-etcd-client\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.649321 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.650292 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-n2tpg"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.651384 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.652482 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-49tss"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.654403 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6kzvm"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.655518 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f79231-ace1-41b1-ae23-f812f404fb67-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-szfmb\" (UID: \"f0f79231-ace1-41b1-ae23-f812f404fb67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.655517 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b6cce2-f501-46bb-af41-3933baf3205c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l7j2f\" (UID: \"e4b6cce2-f501-46bb-af41-3933baf3205c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.655766 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.655902 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lz8p8"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.655868 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-encryption-config\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.655926 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-machine-approver-tls\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.656109 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/23993c93-7c44-4d7c-8758-1cf3666212a5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.656163 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-etcd-client\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.656247 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.656300 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5e681f-ca95-4ba0-935e-86f18702cf78-serving-cert\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.657007 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9204723-54c5-457c-8bb8-58be85f199e2-serving-cert\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.657083 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-n5xpn"] Nov 22 
09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.657838 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.657967 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ee64233-aae3-4a4a-815d-ca55dd93bfb4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4xjfk\" (UID: \"9ee64233-aae3-4a4a-815d-ca55dd93bfb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.658530 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.665409 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.666562 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.668220 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.668223 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8418012a-eb36-472f-a4ea-49d3af8dbd09-config\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.670994 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hmwp7"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.671020 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.672390 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-slgpd"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.673241 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.677018 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.677938 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.680168 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wd5kx"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.683445 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-lgbpn"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.684067 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.684816 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mhwdb"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.686234 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.688002 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.689825 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.693371 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tfpk7"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.695421 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tfpk7"] Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.695541 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.704096 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.723791 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.733554 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8418012a-eb36-472f-a4ea-49d3af8dbd09-etcd-ca\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.734416 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxzvp\" (UniqueName: \"kubernetes.io/projected/9f953d76-d324-4923-8767-534c7fec6648-kube-api-access-sxzvp\") pod \"migrator-59844c95c7-268n8\" (UID: \"9f953d76-d324-4923-8767-534c7fec6648\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.744262 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.763577 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.786964 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.805629 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.819295 4846 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8418012a-eb36-472f-a4ea-49d3af8dbd09-serving-cert\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.823708 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.844143 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.864110 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.883912 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.903880 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.909759 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8418012a-eb36-472f-a4ea-49d3af8dbd09-etcd-service-ca\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.944933 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.964826 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.975604 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:16:04 crc kubenswrapper[4846]: I1122 09:16:04.985598 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.004469 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.024122 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.044140 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.063145 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.084368 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.104186 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 22 09:16:05 
crc kubenswrapper[4846]: I1122 09:16:05.124784 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.144951 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.164314 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.183910 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.203941 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.225180 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.245207 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.265952 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.284008 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.305524 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.324873 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.344712 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.364276 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.384276 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.404917 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.425082 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.452646 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.464680 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.485164 
4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.502266 4846 request.go:700] Waited for 1.001972051s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0 Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.505215 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.543818 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.564950 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.584745 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.612908 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.624845 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.645264 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.665575 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.684633 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.705010 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.724240 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.744428 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.764622 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.784034 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.805944 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.824361 4846 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.844320 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.865281 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.884330 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.904413 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.924113 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.944124 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.964796 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 22 09:16:05 crc kubenswrapper[4846]: I1122 09:16:05.984841 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.003957 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.024125 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.044725 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.064033 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.083372 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.103749 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.123454 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.143480 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.164604 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.200933 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87xqz\" (UniqueName: \"kubernetes.io/projected/22bb9a30-7380-4482-b556-57bed8a7d681-kube-api-access-87xqz\") pod 
\"openshift-config-operator-7777fb866f-mzg5p\" (UID: \"22bb9a30-7380-4482-b556-57bed8a7d681\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.224375 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp59c\" (UniqueName: \"kubernetes.io/projected/d61b0632-6ae3-43fa-b1d5-ebe9671be6cd-kube-api-access-qp59c\") pod \"machine-approver-56656f9798-g4frl\" (UID: \"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.250463 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-772mk\" (UniqueName: \"kubernetes.io/projected/8418012a-eb36-472f-a4ea-49d3af8dbd09-kube-api-access-772mk\") pod \"etcd-operator-b45778765-49tss\" (UID: \"8418012a-eb36-472f-a4ea-49d3af8dbd09\") " pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.266337 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgzmt\" (UniqueName: \"kubernetes.io/projected/23993c93-7c44-4d7c-8758-1cf3666212a5-kube-api-access-mgzmt\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.280635 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfvd\" (UniqueName: \"kubernetes.io/projected/4b0be692-d108-4051-9a33-6529b4ed1e7b-kube-api-access-wlfvd\") pod \"oauth-openshift-558db77b4-2fp62\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.299111 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls6wc\" (UniqueName: \"kubernetes.io/projected/20741111-12b6-4d66-9743-c51d0b8a1a5b-kube-api-access-ls6wc\") pod \"openshift-controller-manager-operator-756b6f6bc6-zgtrh\" (UID: \"20741111-12b6-4d66-9743-c51d0b8a1a5b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.318030 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sddbs\" (UniqueName: \"kubernetes.io/projected/fd5e681f-ca95-4ba0-935e-86f18702cf78-kube-api-access-sddbs\") pod \"controller-manager-879f6c89f-xxpch\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.343689 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghck6\" (UniqueName: \"kubernetes.io/projected/f0f79231-ace1-41b1-ae23-f812f404fb67-kube-api-access-ghck6\") pod \"openshift-apiserver-operator-796bbdcf4f-szfmb\" (UID: \"f0f79231-ace1-41b1-ae23-f812f404fb67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.357469 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kv8b\" (UniqueName: \"kubernetes.io/projected/01e5ec75-28e3-4baa-8501-cbe8c740ec3f-kube-api-access-6kv8b\") pod \"machine-api-operator-5694c8668f-lz8p8\" (UID: 
\"01e5ec75-28e3-4baa-8501-cbe8c740ec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.400534 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k626n\" (UniqueName: \"kubernetes.io/projected/b961dfe4-8e3d-4cf8-8032-2293ea7240fe-kube-api-access-k626n\") pod \"apiserver-76f77b778f-wgnrq\" (UID: \"b961dfe4-8e3d-4cf8-8032-2293ea7240fe\") " pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.405974 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ttp4\" (UniqueName: \"kubernetes.io/projected/09e9b382-4c2b-440b-978b-3aab0494d892-kube-api-access-9ttp4\") pod \"dns-operator-744455d44c-n2tpg\" (UID: \"09e9b382-4c2b-440b-978b-3aab0494d892\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.422609 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23993c93-7c44-4d7c-8758-1cf3666212a5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h9qpp\" (UID: \"23993c93-7c44-4d7c-8758-1cf3666212a5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.441954 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfmdj\" (UniqueName: \"kubernetes.io/projected/ddb3fe43-72d0-41e4-871e-0fa81e7a52a3-kube-api-access-cfmdj\") pod \"apiserver-7bbb656c7d-zxtgf\" (UID: \"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.446535 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.454957 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.459451 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.461545 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8mlh\" (UniqueName: \"kubernetes.io/projected/c9204723-54c5-457c-8bb8-58be85f199e2-kube-api-access-f8mlh\") pod \"route-controller-manager-6576b87f9c-vmbgx\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.479486 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.481484 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.483260 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gtnb\" (UniqueName: \"kubernetes.io/projected/42fec650-4eb0-4cb8-adf9-acaebf0ba09e-kube-api-access-7gtnb\") pod \"console-operator-58897d9998-zdpb4\" (UID: \"42fec650-4eb0-4cb8-adf9-acaebf0ba09e\") " pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.490651 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.503618 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrmh9\" (UniqueName: \"kubernetes.io/projected/f23592b0-b045-4aa5-a22f-c15133890ed4-kube-api-access-qrmh9\") pod \"console-f9d7485db-k86mj\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.520444 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcxd2\" (UniqueName: \"kubernetes.io/projected/9ee64233-aae3-4a4a-815d-ca55dd93bfb4-kube-api-access-rcxd2\") pod \"cluster-samples-operator-665b6dd947-4xjfk\" (UID: \"9ee64233-aae3-4a4a-815d-ca55dd93bfb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.521978 4846 request.go:700] Waited for 1.894152339s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/default/token Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.529644 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.538946 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24qnq\" (UniqueName: \"kubernetes.io/projected/6cc4154d-473a-46cf-acf2-6978d0e642ee-kube-api-access-24qnq\") pod \"downloads-7954f5f757-n5xpn\" (UID: \"6cc4154d-473a-46cf-acf2-6978d0e642ee\") " pod="openshift-console/downloads-7954f5f757-n5xpn" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.564965 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp6pf\" (UniqueName: \"kubernetes.io/projected/98d61b3e-6191-4d14-823a-f791ddf65cae-kube-api-access-dp6pf\") pod \"authentication-operator-69f744f599-v92tx\" (UID: \"98d61b3e-6191-4d14-823a-f791ddf65cae\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.575355 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.579673 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.584236 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.592083 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4b6cce2-f501-46bb-af41-3933baf3205c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l7j2f\" (UID: \"e4b6cce2-f501-46bb-af41-3933baf3205c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.595030 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.606168 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.609296 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.627401 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.629191 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.645289 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.657353 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.664813 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.666395 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.673809 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.683684 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.685746 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.708139 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.724522 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.744277 4846 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.763902 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.770300 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.794715 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp"] Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.803955 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-n5xpn" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.804804 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxzvp\" (UniqueName: \"kubernetes.io/projected/9f953d76-d324-4923-8767-534c7fec6648-kube-api-access-sxzvp\") pod \"migrator-59844c95c7-268n8\" (UID: \"9f953d76-d324-4923-8767-534c7fec6648\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.814899 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.871828 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872435 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd121269-e390-4baf-bf26-24c5fd4dac70-config\") pod \"kube-apiserver-operator-766d6c64bb-mg7kh\" (UID: \"cd121269-e390-4baf-bf26-24c5fd4dac70\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872499 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1064a42-3995-4b2a-844d-6b2ea290d5c8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwg48\" (UID: \"c1064a42-3995-4b2a-844d-6b2ea290d5c8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872625 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7ab0befc-0749-4a94-9d57-adc79f211e9d-images\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872655 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95298591-2816-41fe-8c02-5d15ea156d80-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kr7qw\" (UID: \"95298591-2816-41fe-8c02-5d15ea156d80\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872675 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hmwp7\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872700 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kqlw\" (UniqueName: \"kubernetes.io/projected/95298591-2816-41fe-8c02-5d15ea156d80-kube-api-access-5kqlw\") pod \"kube-storage-version-migrator-operator-b67b599dd-kr7qw\" (UID: \"95298591-2816-41fe-8c02-5d15ea156d80\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872724 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1064a42-3995-4b2a-844d-6b2ea290d5c8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwg48\" (UID: \"c1064a42-3995-4b2a-844d-6b2ea290d5c8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872748 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hmwp7\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872788 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf-proxy-tls\") pod \"machine-config-controller-84d6567774-jjv78\" (UID: \"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872835 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htjkk\" (UniqueName: \"kubernetes.io/projected/467cf0eb-8a51-4268-b3a9-b308a52aed81-kube-api-access-htjkk\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872854 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgzmn\" (UniqueName: \"kubernetes.io/projected/c7a523a9-8ee9-4bab-8baa-ac393331dd07-kube-api-access-tgzmn\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872922 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkpcb\" (UniqueName: \"kubernetes.io/projected/27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf-kube-api-access-qkpcb\") pod \"machine-config-controller-84d6567774-jjv78\" (UID: \"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872959 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd121269-e390-4baf-bf26-24c5fd4dac70-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mg7kh\" (UID: \"cd121269-e390-4baf-bf26-24c5fd4dac70\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.872980 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7a523a9-8ee9-4bab-8baa-ac393331dd07-service-ca-bundle\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873058 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4225b942-8fc1-4b47-906a-f443ddc4aab4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6kzvm\" (UID: \"4225b942-8fc1-4b47-906a-f443ddc4aab4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873117 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-544pr\" (UniqueName: \"kubernetes.io/projected/7ab0befc-0749-4a94-9d57-adc79f211e9d-kube-api-access-544pr\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873157 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ab0befc-0749-4a94-9d57-adc79f211e9d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873328 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-registry-certificates\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873414 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee403130-f909-4216-a9ff-8a4cb41d4017-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b95xr\" (UID: \"ee403130-f909-4216-a9ff-8a4cb41d4017\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873450 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd121269-e390-4baf-bf26-24c5fd4dac70-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mg7kh\" (UID: \"cd121269-e390-4baf-bf26-24c5fd4dac70\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873501 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c7a523a9-8ee9-4bab-8baa-ac393331dd07-default-certificate\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873545 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jjv78\" (UID: \"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873574 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-registry-tls\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873715 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ab0befc-0749-4a94-9d57-adc79f211e9d-proxy-tls\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873861 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5482\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-kube-api-access-p5482\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873892 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-trusted-ca\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873927 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5ml6\" (UniqueName: \"kubernetes.io/projected/4225b942-8fc1-4b47-906a-f443ddc4aab4-kube-api-access-t5ml6\") pod \"multus-admission-controller-857f4d67dd-6kzvm\" (UID: \"4225b942-8fc1-4b47-906a-f443ddc4aab4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.873990 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/467cf0eb-8a51-4268-b3a9-b308a52aed81-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.874127 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1064a42-3995-4b2a-844d-6b2ea290d5c8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwg48\" (UID: \"c1064a42-3995-4b2a-844d-6b2ea290d5c8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.874153 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7a523a9-8ee9-4bab-8baa-ac393331dd07-metrics-certs\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.874200 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/467cf0eb-8a51-4268-b3a9-b308a52aed81-metrics-tls\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.874218 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4pwt\" (UniqueName: \"kubernetes.io/projected/750ea675-e79a-459b-8261-e15dd252a8f1-kube-api-access-s4pwt\") pod \"marketplace-operator-79b997595-hmwp7\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.874233 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/467cf0eb-8a51-4268-b3a9-b308a52aed81-trusted-ca\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.875561 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92822bda-884a-4bfc-b651-f58624599346-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.877744 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.877801 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95298591-2816-41fe-8c02-5d15ea156d80-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kr7qw\" (UID: \"95298591-2816-41fe-8c02-5d15ea156d80\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.877917 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92822bda-884a-4bfc-b651-f58624599346-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.877951 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c7a523a9-8ee9-4bab-8baa-ac393331dd07-stats-auth\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.877981 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q25zl\" (UniqueName: \"kubernetes.io/projected/ee403130-f909-4216-a9ff-8a4cb41d4017-kube-api-access-q25zl\") pod \"control-plane-machine-set-operator-78cbb6b69f-b95xr\" (UID: 
\"ee403130-f909-4216-a9ff-8a4cb41d4017\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.878004 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-bound-sa-token\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.880440 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" event={"ID":"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd","Type":"ContainerStarted","Data":"2fec07064090255285e3c51a1a90435fce813b88c2393552b357a906223e4fb9"} Nov 22 09:16:06 crc kubenswrapper[4846]: E1122 09:16:06.889417 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.38939579 +0000 UTC m=+142.325085439 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.891007 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb"] Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.978745 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:06 crc kubenswrapper[4846]: E1122 09:16:06.978936 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.478909961 +0000 UTC m=+142.414599610 (durationBeforeRetry 500ms). 
Nov 22 09:16:06 crc kubenswrapper[4846]: E1122 09:16:06.978936 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.478909961 +0000 UTC m=+142.414599610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.979035 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bacce7c-168f-4813-a111-58d7e0228cd5-serving-cert\") pod \"service-ca-operator-777779d784-slgpd\" (UID: \"4bacce7c-168f-4813-a111-58d7e0228cd5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.979120 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q25zl\" (UniqueName: \"kubernetes.io/projected/ee403130-f909-4216-a9ff-8a4cb41d4017-kube-api-access-q25zl\") pod \"control-plane-machine-set-operator-78cbb6b69f-b95xr\" (UID: \"ee403130-f909-4216-a9ff-8a4cb41d4017\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.979154 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-bound-sa-token\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.979179 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69e8f812-0fb1-406e-93d0-77093b6344fc-metrics-tls\") pod \"dns-default-mhwdb\" (UID: \"69e8f812-0fb1-406e-93d0-77093b6344fc\") " pod="openshift-dns/dns-default-mhwdb"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.979218 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd121269-e390-4baf-bf26-24c5fd4dac70-config\") pod \"kube-apiserver-operator-766d6c64bb-mg7kh\" (UID: \"cd121269-e390-4baf-bf26-24c5fd4dac70\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.979244 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e7797f70-8531-42a8-906f-b16e97b9aabc-webhook-cert\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.979348 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1064a42-3995-4b2a-844d-6b2ea290d5c8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwg48\" (UID: \"c1064a42-3995-4b2a-844d-6b2ea290d5c8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48"
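Both CSI failures have the same root cause: the kubevirt.io.hostpath-provisioner node plugin had not yet registered with this kubelet. Registration happens through a socket the driver's registrar drops under /var/lib/kubelet/plugins_registry/, and on node boot it races with pods that mount the driver's volumes, which is why the operations above are simply retried. A diagnostic sketch that lists the CSI drivers the cluster knows about:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	drivers, err := cs.StorageV1().CSIDrivers().List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range drivers.Items {
		// Once node registration completes, kubevirt.io.hostpath-provisioner
		// should appear here and the parked mounts succeed on retry.
		fmt.Println(d.Name)
	}
}
```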
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9503768e-24bc-4280-ba41-a96116f9523e-signing-cabundle\") pod \"service-ca-9c57cc56f-lgbpn\" (UID: \"9503768e-24bc-4280-ba41-a96116f9523e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980099 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/71a2cf81-3a3f-4bc9-8b67-57ad66576390-node-bootstrap-token\") pod \"machine-config-server-nrm4s\" (UID: \"71a2cf81-3a3f-4bc9-8b67-57ad66576390\") " pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980140 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1925ea63-7ab4-4dfb-90e0-0527730e9e24-profile-collector-cert\") pod \"catalog-operator-68c6474976-8lfzj\" (UID: \"1925ea63-7ab4-4dfb-90e0-0527730e9e24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980187 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7ab0befc-0749-4a94-9d57-adc79f211e9d-images\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980219 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95298591-2816-41fe-8c02-5d15ea156d80-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kr7qw\" (UID: \"95298591-2816-41fe-8c02-5d15ea156d80\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980243 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hmwp7\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980264 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kqlw\" (UniqueName: \"kubernetes.io/projected/95298591-2816-41fe-8c02-5d15ea156d80-kube-api-access-5kqlw\") pod \"kube-storage-version-migrator-operator-b67b599dd-kr7qw\" (UID: \"95298591-2816-41fe-8c02-5d15ea156d80\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980283 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1064a42-3995-4b2a-844d-6b2ea290d5c8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwg48\" (UID: \"c1064a42-3995-4b2a-844d-6b2ea290d5c8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980307 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hmwp7\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980336 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd121269-e390-4baf-bf26-24c5fd4dac70-config\") pod \"kube-apiserver-operator-766d6c64bb-mg7kh\" (UID: \"cd121269-e390-4baf-bf26-24c5fd4dac70\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980942 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7ab0befc-0749-4a94-9d57-adc79f211e9d-images\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.981062 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95298591-2816-41fe-8c02-5d15ea156d80-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kr7qw\" (UID: \"95298591-2816-41fe-8c02-5d15ea156d80\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.980341 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf-proxy-tls\") pod \"machine-config-controller-84d6567774-jjv78\" (UID: \"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.981330 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1925ea63-7ab4-4dfb-90e0-0527730e9e24-srv-cert\") pod \"catalog-operator-68c6474976-8lfzj\" (UID: \"1925ea63-7ab4-4dfb-90e0-0527730e9e24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.981365 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htjkk\" (UniqueName: \"kubernetes.io/projected/467cf0eb-8a51-4268-b3a9-b308a52aed81-kube-api-access-htjkk\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.981385 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qztkl\" (UniqueName: \"kubernetes.io/projected/73f80696-3504-4a89-9681-4925daceb257-kube-api-access-qztkl\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7"
(UniqueName: \"kubernetes.io/configmap/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hmwp7\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.982741 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzmn\" (UniqueName: \"kubernetes.io/projected/c7a523a9-8ee9-4bab-8baa-ac393331dd07-kube-api-access-tgzmn\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.982857 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bacce7c-168f-4813-a111-58d7e0228cd5-config\") pod \"service-ca-operator-777779d784-slgpd\" (UID: \"4bacce7c-168f-4813-a111-58d7e0228cd5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.982905 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkpcb\" (UniqueName: \"kubernetes.io/projected/27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf-kube-api-access-qkpcb\") pod \"machine-config-controller-84d6567774-jjv78\" (UID: \"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.982930 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e7797f70-8531-42a8-906f-b16e97b9aabc-tmpfs\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983013 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd121269-e390-4baf-bf26-24c5fd4dac70-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mg7kh\" (UID: \"cd121269-e390-4baf-bf26-24c5fd4dac70\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983124 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7a523a9-8ee9-4bab-8baa-ac393331dd07-service-ca-bundle\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983155 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-298h9\" (UniqueName: \"kubernetes.io/projected/12ae8954-3863-4839-b6e7-e500df9ec73b-kube-api-access-298h9\") pod \"ingress-canary-wd5kx\" (UID: \"12ae8954-3863-4839-b6e7-e500df9ec73b\") " pod="openshift-ingress-canary/ingress-canary-wd5kx" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983205 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4225b942-8fc1-4b47-906a-f443ddc4aab4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6kzvm\" 
(UID: \"4225b942-8fc1-4b47-906a-f443ddc4aab4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983446 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95r5r\" (UniqueName: \"kubernetes.io/projected/e7797f70-8531-42a8-906f-b16e97b9aabc-kube-api-access-95r5r\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983803 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-mountpoint-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983852 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-544pr\" (UniqueName: \"kubernetes.io/projected/7ab0befc-0749-4a94-9d57-adc79f211e9d-kube-api-access-544pr\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983883 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ab0befc-0749-4a94-9d57-adc79f211e9d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983910 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-registration-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983933 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-csi-data-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983954 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-secret-volume\") pod \"collect-profiles-29396715-nmwg4\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.983978 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7a523a9-8ee9-4bab-8baa-ac393331dd07-service-ca-bundle\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc 
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984013 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-registry-certificates\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984133 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee403130-f909-4216-a9ff-8a4cb41d4017-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b95xr\" (UID: \"ee403130-f909-4216-a9ff-8a4cb41d4017\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984176 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd121269-e390-4baf-bf26-24c5fd4dac70-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mg7kh\" (UID: \"cd121269-e390-4baf-bf26-24c5fd4dac70\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984213 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xxz7\" (UniqueName: \"kubernetes.io/projected/a86e8307-1c84-49f8-ab9e-8602c411ecf9-kube-api-access-7xxz7\") pod \"olm-operator-6b444d44fb-jq2bk\" (UID: \"a86e8307-1c84-49f8-ab9e-8602c411ecf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984250 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbx6\" (UniqueName: \"kubernetes.io/projected/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-kube-api-access-rfbx6\") pod \"collect-profiles-29396715-nmwg4\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984298 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqcnn\" (UniqueName: \"kubernetes.io/projected/4bacce7c-168f-4813-a111-58d7e0228cd5-kube-api-access-sqcnn\") pod \"service-ca-operator-777779d784-slgpd\" (UID: \"4bacce7c-168f-4813-a111-58d7e0228cd5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984367 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c7a523a9-8ee9-4bab-8baa-ac393331dd07-default-certificate\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984399 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jjv78\" (UID: \"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78"
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984426 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzggs\" (UniqueName: \"kubernetes.io/projected/71a2cf81-3a3f-4bc9-8b67-57ad66576390-kube-api-access-pzggs\") pod \"machine-config-server-nrm4s\" (UID: \"71a2cf81-3a3f-4bc9-8b67-57ad66576390\") " pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984486 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-registry-tls\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984512 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a86e8307-1c84-49f8-ab9e-8602c411ecf9-srv-cert\") pod \"olm-operator-6b444d44fb-jq2bk\" (UID: \"a86e8307-1c84-49f8-ab9e-8602c411ecf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984567 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ab0befc-0749-4a94-9d57-adc79f211e9d-proxy-tls\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984601 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12ae8954-3863-4839-b6e7-e500df9ec73b-cert\") pod \"ingress-canary-wd5kx\" (UID: \"12ae8954-3863-4839-b6e7-e500df9ec73b\") " pod="openshift-ingress-canary/ingress-canary-wd5kx" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984634 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5482\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-kube-api-access-p5482\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984664 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-trusted-ca\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984689 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5ml6\" (UniqueName: \"kubernetes.io/projected/4225b942-8fc1-4b47-906a-f443ddc4aab4-kube-api-access-t5ml6\") pod \"multus-admission-controller-857f4d67dd-6kzvm\" (UID: \"4225b942-8fc1-4b47-906a-f443ddc4aab4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984715 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-socket-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984743 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e7797f70-8531-42a8-906f-b16e97b9aabc-apiservice-cert\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984765 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtp7p\" (UniqueName: \"kubernetes.io/projected/69e8f812-0fb1-406e-93d0-77093b6344fc-kube-api-access-wtp7p\") pod \"dns-default-mhwdb\" (UID: \"69e8f812-0fb1-406e-93d0-77093b6344fc\") " pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984816 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/467cf0eb-8a51-4268-b3a9-b308a52aed81-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984845 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-plugins-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984876 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/71a2cf81-3a3f-4bc9-8b67-57ad66576390-certs\") pod \"machine-config-server-nrm4s\" (UID: \"71a2cf81-3a3f-4bc9-8b67-57ad66576390\") " pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984909 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1064a42-3995-4b2a-844d-6b2ea290d5c8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwg48\" (UID: \"c1064a42-3995-4b2a-844d-6b2ea290d5c8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984938 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/467cf0eb-8a51-4268-b3a9-b308a52aed81-metrics-tls\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984963 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4pwt\" (UniqueName: \"kubernetes.io/projected/750ea675-e79a-459b-8261-e15dd252a8f1-kube-api-access-s4pwt\") pod \"marketplace-operator-79b997595-hmwp7\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.984984 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7a523a9-8ee9-4bab-8baa-ac393331dd07-metrics-certs\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.985025 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/467cf0eb-8a51-4268-b3a9-b308a52aed81-trusted-ca\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.985063 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a86e8307-1c84-49f8-ab9e-8602c411ecf9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jq2bk\" (UID: \"a86e8307-1c84-49f8-ab9e-8602c411ecf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.985096 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-config-volume\") pod \"collect-profiles-29396715-nmwg4\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.985140 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92822bda-884a-4bfc-b651-f58624599346-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.985203 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a545b4cc-66ea-4191-9d33-e4e90590e5a8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rv65w\" (UID: \"a545b4cc-66ea-4191-9d33-e4e90590e5a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.985257 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.985206 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-registry-certificates\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" 
Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.986402 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ab0befc-0749-4a94-9d57-adc79f211e9d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.986464 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95298591-2816-41fe-8c02-5d15ea156d80-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kr7qw\" (UID: \"95298591-2816-41fe-8c02-5d15ea156d80\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.986492 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88www\" (UniqueName: \"kubernetes.io/projected/9503768e-24bc-4280-ba41-a96116f9523e-kube-api-access-88www\") pod \"service-ca-9c57cc56f-lgbpn\" (UID: \"9503768e-24bc-4280-ba41-a96116f9523e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.986537 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9503768e-24bc-4280-ba41-a96116f9523e-signing-key\") pod \"service-ca-9c57cc56f-lgbpn\" (UID: \"9503768e-24bc-4280-ba41-a96116f9523e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.986584 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92822bda-884a-4bfc-b651-f58624599346-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.986607 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69e8f812-0fb1-406e-93d0-77093b6344fc-config-volume\") pod \"dns-default-mhwdb\" (UID: \"69e8f812-0fb1-406e-93d0-77093b6344fc\") " pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.986635 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22nml\" (UniqueName: \"kubernetes.io/projected/1925ea63-7ab4-4dfb-90e0-0527730e9e24-kube-api-access-22nml\") pod \"catalog-operator-68c6474976-8lfzj\" (UID: \"1925ea63-7ab4-4dfb-90e0-0527730e9e24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.986662 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c7a523a9-8ee9-4bab-8baa-ac393331dd07-stats-auth\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.986687 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-j46hc\" (UniqueName: \"kubernetes.io/projected/a545b4cc-66ea-4191-9d33-e4e90590e5a8-kube-api-access-j46hc\") pod \"package-server-manager-789f6589d5-rv65w\" (UID: \"a545b4cc-66ea-4191-9d33-e4e90590e5a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.989840 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-trusted-ca\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.990169 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/467cf0eb-8a51-4268-b3a9-b308a52aed81-trusted-ca\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.990619 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1064a42-3995-4b2a-844d-6b2ea290d5c8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwg48\" (UID: \"c1064a42-3995-4b2a-844d-6b2ea290d5c8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:06 crc kubenswrapper[4846]: E1122 09:16:06.993172 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.493148657 +0000 UTC m=+142.428838496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:06 crc kubenswrapper[4846]: I1122 09:16:06.994729 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jjv78\" (UID: \"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.012709 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92822bda-884a-4bfc-b651-f58624599346-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.013013 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/467cf0eb-8a51-4268-b3a9-b308a52aed81-metrics-tls\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.014709 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c7a523a9-8ee9-4bab-8baa-ac393331dd07-default-certificate\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.019173 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-registry-tls\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.019194 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf-proxy-tls\") pod \"machine-config-controller-84d6567774-jjv78\" (UID: \"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.019865 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7a523a9-8ee9-4bab-8baa-ac393331dd07-metrics-certs\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.019967 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ee403130-f909-4216-a9ff-8a4cb41d4017-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b95xr\" (UID: \"ee403130-f909-4216-a9ff-8a4cb41d4017\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.020316 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ab0befc-0749-4a94-9d57-adc79f211e9d-proxy-tls\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.020361 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c7a523a9-8ee9-4bab-8baa-ac393331dd07-stats-auth\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.020664 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95298591-2816-41fe-8c02-5d15ea156d80-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kr7qw\" (UID: \"95298591-2816-41fe-8c02-5d15ea156d80\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.021519 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4225b942-8fc1-4b47-906a-f443ddc4aab4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6kzvm\" (UID: \"4225b942-8fc1-4b47-906a-f443ddc4aab4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.022279 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd121269-e390-4baf-bf26-24c5fd4dac70-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mg7kh\" (UID: \"cd121269-e390-4baf-bf26-24c5fd4dac70\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.022314 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hmwp7\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.024432 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q25zl\" (UniqueName: \"kubernetes.io/projected/ee403130-f909-4216-a9ff-8a4cb41d4017-kube-api-access-q25zl\") pod \"control-plane-machine-set-operator-78cbb6b69f-b95xr\" (UID: \"ee403130-f909-4216-a9ff-8a4cb41d4017\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.039833 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-bound-sa-token\") pod 
\"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: W1122 09:16:07.064106 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f79231_ace1_41b1_ae23_f812f404fb67.slice/crio-9cc8cf04e7a3b5b6951252a5e58dd6aef123e53f0a5be166c9eef4ea18218a73 WatchSource:0}: Error finding container 9cc8cf04e7a3b5b6951252a5e58dd6aef123e53f0a5be166c9eef4ea18218a73: Status 404 returned error can't find the container with id 9cc8cf04e7a3b5b6951252a5e58dd6aef123e53f0a5be166c9eef4ea18218a73 Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.087757 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088068 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12ae8954-3863-4839-b6e7-e500df9ec73b-cert\") pod \"ingress-canary-wd5kx\" (UID: \"12ae8954-3863-4839-b6e7-e500df9ec73b\") " pod="openshift-ingress-canary/ingress-canary-wd5kx" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088131 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-socket-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088159 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e7797f70-8531-42a8-906f-b16e97b9aabc-apiservice-cert\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088191 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtp7p\" (UniqueName: \"kubernetes.io/projected/69e8f812-0fb1-406e-93d0-77093b6344fc-kube-api-access-wtp7p\") pod \"dns-default-mhwdb\" (UID: \"69e8f812-0fb1-406e-93d0-77093b6344fc\") " pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088241 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-plugins-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088266 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/71a2cf81-3a3f-4bc9-8b67-57ad66576390-certs\") pod \"machine-config-server-nrm4s\" (UID: \"71a2cf81-3a3f-4bc9-8b67-57ad66576390\") " pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088306 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a86e8307-1c84-49f8-ab9e-8602c411ecf9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jq2bk\" (UID: \"a86e8307-1c84-49f8-ab9e-8602c411ecf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088331 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-config-volume\") pod \"collect-profiles-29396715-nmwg4\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.088473 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.588401786 +0000 UTC m=+142.524091435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088604 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a545b4cc-66ea-4191-9d33-e4e90590e5a8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rv65w\" (UID: \"a545b4cc-66ea-4191-9d33-e4e90590e5a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088642 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88www\" (UniqueName: \"kubernetes.io/projected/9503768e-24bc-4280-ba41-a96116f9523e-kube-api-access-88www\") pod \"service-ca-9c57cc56f-lgbpn\" (UID: \"9503768e-24bc-4280-ba41-a96116f9523e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088677 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088701 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9503768e-24bc-4280-ba41-a96116f9523e-signing-key\") pod \"service-ca-9c57cc56f-lgbpn\" (UID: \"9503768e-24bc-4280-ba41-a96116f9523e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088762 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69e8f812-0fb1-406e-93d0-77093b6344fc-config-volume\") pod 
\"dns-default-mhwdb\" (UID: \"69e8f812-0fb1-406e-93d0-77093b6344fc\") " pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088795 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46hc\" (UniqueName: \"kubernetes.io/projected/a545b4cc-66ea-4191-9d33-e4e90590e5a8-kube-api-access-j46hc\") pod \"package-server-manager-789f6589d5-rv65w\" (UID: \"a545b4cc-66ea-4191-9d33-e4e90590e5a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088834 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22nml\" (UniqueName: \"kubernetes.io/projected/1925ea63-7ab4-4dfb-90e0-0527730e9e24-kube-api-access-22nml\") pod \"catalog-operator-68c6474976-8lfzj\" (UID: \"1925ea63-7ab4-4dfb-90e0-0527730e9e24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088874 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bacce7c-168f-4813-a111-58d7e0228cd5-serving-cert\") pod \"service-ca-operator-777779d784-slgpd\" (UID: \"4bacce7c-168f-4813-a111-58d7e0228cd5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088918 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69e8f812-0fb1-406e-93d0-77093b6344fc-metrics-tls\") pod \"dns-default-mhwdb\" (UID: \"69e8f812-0fb1-406e-93d0-77093b6344fc\") " pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088943 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e7797f70-8531-42a8-906f-b16e97b9aabc-webhook-cert\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.088980 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9503768e-24bc-4280-ba41-a96116f9523e-signing-cabundle\") pod \"service-ca-9c57cc56f-lgbpn\" (UID: \"9503768e-24bc-4280-ba41-a96116f9523e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089010 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/71a2cf81-3a3f-4bc9-8b67-57ad66576390-node-bootstrap-token\") pod \"machine-config-server-nrm4s\" (UID: \"71a2cf81-3a3f-4bc9-8b67-57ad66576390\") " pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089042 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1925ea63-7ab4-4dfb-90e0-0527730e9e24-profile-collector-cert\") pod \"catalog-operator-68c6474976-8lfzj\" (UID: \"1925ea63-7ab4-4dfb-90e0-0527730e9e24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089184 4846 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1925ea63-7ab4-4dfb-90e0-0527730e9e24-srv-cert\") pod \"catalog-operator-68c6474976-8lfzj\" (UID: \"1925ea63-7ab4-4dfb-90e0-0527730e9e24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089225 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qztkl\" (UniqueName: \"kubernetes.io/projected/73f80696-3504-4a89-9681-4925daceb257-kube-api-access-qztkl\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089279 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bacce7c-168f-4813-a111-58d7e0228cd5-config\") pod \"service-ca-operator-777779d784-slgpd\" (UID: \"4bacce7c-168f-4813-a111-58d7e0228cd5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089322 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e7797f70-8531-42a8-906f-b16e97b9aabc-tmpfs\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089369 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-298h9\" (UniqueName: \"kubernetes.io/projected/12ae8954-3863-4839-b6e7-e500df9ec73b-kube-api-access-298h9\") pod \"ingress-canary-wd5kx\" (UID: \"12ae8954-3863-4839-b6e7-e500df9ec73b\") " pod="openshift-ingress-canary/ingress-canary-wd5kx" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089411 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-mountpoint-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089438 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95r5r\" (UniqueName: \"kubernetes.io/projected/e7797f70-8531-42a8-906f-b16e97b9aabc-kube-api-access-95r5r\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089448 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-config-volume\") pod \"collect-profiles-29396715-nmwg4\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089476 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-registration-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " 
pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089508 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-csi-data-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089529 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-secret-volume\") pod \"collect-profiles-29396715-nmwg4\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089551 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xxz7\" (UniqueName: \"kubernetes.io/projected/a86e8307-1c84-49f8-ab9e-8602c411ecf9-kube-api-access-7xxz7\") pod \"olm-operator-6b444d44fb-jq2bk\" (UID: \"a86e8307-1c84-49f8-ab9e-8602c411ecf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089576 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbx6\" (UniqueName: \"kubernetes.io/projected/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-kube-api-access-rfbx6\") pod \"collect-profiles-29396715-nmwg4\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089595 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqcnn\" (UniqueName: \"kubernetes.io/projected/4bacce7c-168f-4813-a111-58d7e0228cd5-kube-api-access-sqcnn\") pod \"service-ca-operator-777779d784-slgpd\" (UID: \"4bacce7c-168f-4813-a111-58d7e0228cd5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089619 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzggs\" (UniqueName: \"kubernetes.io/projected/71a2cf81-3a3f-4bc9-8b67-57ad66576390-kube-api-access-pzggs\") pod \"machine-config-server-nrm4s\" (UID: \"71a2cf81-3a3f-4bc9-8b67-57ad66576390\") " pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089635 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a86e8307-1c84-49f8-ab9e-8602c411ecf9-srv-cert\") pod \"olm-operator-6b444d44fb-jq2bk\" (UID: \"a86e8307-1c84-49f8-ab9e-8602c411ecf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.089819 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-socket-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.090240 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kqlw\" (UniqueName: 
\"kubernetes.io/projected/95298591-2816-41fe-8c02-5d15ea156d80-kube-api-access-5kqlw\") pod \"kube-storage-version-migrator-operator-b67b599dd-kr7qw\" (UID: \"95298591-2816-41fe-8c02-5d15ea156d80\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.090329 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-plugins-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.094821 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/71a2cf81-3a3f-4bc9-8b67-57ad66576390-certs\") pod \"machine-config-server-nrm4s\" (UID: \"71a2cf81-3a3f-4bc9-8b67-57ad66576390\") " pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.095125 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.595099831 +0000 UTC m=+142.530789660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.095576 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69e8f812-0fb1-406e-93d0-77093b6344fc-config-volume\") pod \"dns-default-mhwdb\" (UID: \"69e8f812-0fb1-406e-93d0-77093b6344fc\") " pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.095740 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-csi-data-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.096029 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e7797f70-8531-42a8-906f-b16e97b9aabc-tmpfs\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.096214 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-mountpoint-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.103427 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a545b4cc-66ea-4191-9d33-e4e90590e5a8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rv65w\" (UID: \"a545b4cc-66ea-4191-9d33-e4e90590e5a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.107604 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73f80696-3504-4a89-9681-4925daceb257-registration-dir\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.107704 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bacce7c-168f-4813-a111-58d7e0228cd5-serving-cert\") pod \"service-ca-operator-777779d784-slgpd\" (UID: \"4bacce7c-168f-4813-a111-58d7e0228cd5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.108012 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a86e8307-1c84-49f8-ab9e-8602c411ecf9-srv-cert\") pod \"olm-operator-6b444d44fb-jq2bk\" (UID: \"a86e8307-1c84-49f8-ab9e-8602c411ecf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.108374 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e7797f70-8531-42a8-906f-b16e97b9aabc-apiservice-cert\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.108971 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12ae8954-3863-4839-b6e7-e500df9ec73b-cert\") pod \"ingress-canary-wd5kx\" (UID: \"12ae8954-3863-4839-b6e7-e500df9ec73b\") " pod="openshift-ingress-canary/ingress-canary-wd5kx" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.114057 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1925ea63-7ab4-4dfb-90e0-0527730e9e24-profile-collector-cert\") pod \"catalog-operator-68c6474976-8lfzj\" (UID: \"1925ea63-7ab4-4dfb-90e0-0527730e9e24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.120368 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bacce7c-168f-4813-a111-58d7e0228cd5-config\") pod \"service-ca-operator-777779d784-slgpd\" (UID: \"4bacce7c-168f-4813-a111-58d7e0228cd5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.127009 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69e8f812-0fb1-406e-93d0-77093b6344fc-metrics-tls\") pod \"dns-default-mhwdb\" (UID: \"69e8f812-0fb1-406e-93d0-77093b6344fc\") " pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.127658 4846 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lz8p8"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.132937 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-secret-volume\") pod \"collect-profiles-29396715-nmwg4\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.132943 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a86e8307-1c84-49f8-ab9e-8602c411ecf9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jq2bk\" (UID: \"a86e8307-1c84-49f8-ab9e-8602c411ecf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.137181 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.156817 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgzmn\" (UniqueName: \"kubernetes.io/projected/c7a523a9-8ee9-4bab-8baa-ac393331dd07-kube-api-access-tgzmn\") pod \"router-default-5444994796-tzhcp\" (UID: \"c7a523a9-8ee9-4bab-8baa-ac393331dd07\") " pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.157707 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1925ea63-7ab4-4dfb-90e0-0527730e9e24-srv-cert\") pod \"catalog-operator-68c6474976-8lfzj\" (UID: \"1925ea63-7ab4-4dfb-90e0-0527730e9e24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.157855 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/71a2cf81-3a3f-4bc9-8b67-57ad66576390-node-bootstrap-token\") pod \"machine-config-server-nrm4s\" (UID: \"71a2cf81-3a3f-4bc9-8b67-57ad66576390\") " pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.162077 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkpcb\" (UniqueName: \"kubernetes.io/projected/27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf-kube-api-access-qkpcb\") pod \"machine-config-controller-84d6567774-jjv78\" (UID: \"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.164707 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9503768e-24bc-4280-ba41-a96116f9523e-signing-key\") pod \"service-ca-9c57cc56f-lgbpn\" (UID: \"9503768e-24bc-4280-ba41-a96116f9523e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.164919 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htjkk\" (UniqueName: \"kubernetes.io/projected/467cf0eb-8a51-4268-b3a9-b308a52aed81-kube-api-access-htjkk\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: 
\"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.165811 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1064a42-3995-4b2a-844d-6b2ea290d5c8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwg48\" (UID: \"c1064a42-3995-4b2a-844d-6b2ea290d5c8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.170590 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92822bda-884a-4bfc-b651-f58624599346-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.170990 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9503768e-24bc-4280-ba41-a96116f9523e-signing-cabundle\") pod \"service-ca-9c57cc56f-lgbpn\" (UID: \"9503768e-24bc-4280-ba41-a96116f9523e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.171456 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd121269-e390-4baf-bf26-24c5fd4dac70-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mg7kh\" (UID: \"cd121269-e390-4baf-bf26-24c5fd4dac70\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.177203 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.178846 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1064a42-3995-4b2a-844d-6b2ea290d5c8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jwg48\" (UID: \"c1064a42-3995-4b2a-844d-6b2ea290d5c8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.179231 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.184299 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-544pr\" (UniqueName: \"kubernetes.io/projected/7ab0befc-0749-4a94-9d57-adc79f211e9d-kube-api-access-544pr\") pod \"machine-config-operator-74547568cd-2zbvb\" (UID: \"7ab0befc-0749-4a94-9d57-adc79f211e9d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.189031 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e7797f70-8531-42a8-906f-b16e97b9aabc-webhook-cert\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.191510 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.193011 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-49tss"] Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.193564 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.693509862 +0000 UTC m=+142.629199501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.204887 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.205860 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2fp62"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.220401 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.221189 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5ml6\" (UniqueName: \"kubernetes.io/projected/4225b942-8fc1-4b47-906a-f443ddc4aab4-kube-api-access-t5ml6\") pod \"multus-admission-controller-857f4d67dd-6kzvm\" (UID: \"4225b942-8fc1-4b47-906a-f443ddc4aab4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.234484 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4pwt\" (UniqueName: \"kubernetes.io/projected/750ea675-e79a-459b-8261-e15dd252a8f1-kube-api-access-s4pwt\") pod \"marketplace-operator-79b997595-hmwp7\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.250235 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/467cf0eb-8a51-4268-b3a9-b308a52aed81-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kgj2l\" (UID: \"467cf0eb-8a51-4268-b3a9-b308a52aed81\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.263633 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5482\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-kube-api-access-p5482\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.266227 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.306325 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.307175 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.807157778 +0000 UTC m=+142.742847437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.335122 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22nml\" (UniqueName: \"kubernetes.io/projected/1925ea63-7ab4-4dfb-90e0-0527730e9e24-kube-api-access-22nml\") pod \"catalog-operator-68c6474976-8lfzj\" (UID: \"1925ea63-7ab4-4dfb-90e0-0527730e9e24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.387674 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtp7p\" (UniqueName: \"kubernetes.io/projected/69e8f812-0fb1-406e-93d0-77093b6344fc-kube-api-access-wtp7p\") pod \"dns-default-mhwdb\" (UID: \"69e8f812-0fb1-406e-93d0-77093b6344fc\") " pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.390128 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-298h9\" (UniqueName: \"kubernetes.io/projected/12ae8954-3863-4839-b6e7-e500df9ec73b-kube-api-access-298h9\") pod \"ingress-canary-wd5kx\" (UID: \"12ae8954-3863-4839-b6e7-e500df9ec73b\") " pod="openshift-ingress-canary/ingress-canary-wd5kx" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.397111 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zdpb4"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.401120 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-n2tpg"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.404882 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88www\" (UniqueName: \"kubernetes.io/projected/9503768e-24bc-4280-ba41-a96116f9523e-kube-api-access-88www\") pod \"service-ca-9c57cc56f-lgbpn\" (UID: \"9503768e-24bc-4280-ba41-a96116f9523e\") " pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.405484 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xxpch"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.407629 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.407892 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.907855015 +0000 UTC m=+142.843544674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.408215 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.408314 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95r5r\" (UniqueName: \"kubernetes.io/projected/e7797f70-8531-42a8-906f-b16e97b9aabc-kube-api-access-95r5r\") pod \"packageserver-d55dfcdfc-tlxsw\" (UID: \"e7797f70-8531-42a8-906f-b16e97b9aabc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.408718 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:07.90870926 +0000 UTC m=+142.844398899 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.409002 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46hc\" (UniqueName: \"kubernetes.io/projected/a545b4cc-66ea-4191-9d33-e4e90590e5a8-kube-api-access-j46hc\") pod \"package-server-manager-789f6589d5-rv65w\" (UID: \"a545b4cc-66ea-4191-9d33-e4e90590e5a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.409183 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.421824 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbx6\" (UniqueName: \"kubernetes.io/projected/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-kube-api-access-rfbx6\") pod \"collect-profiles-29396715-nmwg4\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.423665 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.442398 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xxz7\" (UniqueName: \"kubernetes.io/projected/a86e8307-1c84-49f8-ab9e-8602c411ecf9-kube-api-access-7xxz7\") pod \"olm-operator-6b444d44fb-jq2bk\" (UID: \"a86e8307-1c84-49f8-ab9e-8602c411ecf9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.444315 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.451550 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.461737 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqcnn\" (UniqueName: \"kubernetes.io/projected/4bacce7c-168f-4813-a111-58d7e0228cd5-kube-api-access-sqcnn\") pod \"service-ca-operator-777779d784-slgpd\" (UID: \"4bacce7c-168f-4813-a111-58d7e0228cd5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.489060 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzggs\" (UniqueName: \"kubernetes.io/projected/71a2cf81-3a3f-4bc9-8b67-57ad66576390-kube-api-access-pzggs\") pod \"machine-config-server-nrm4s\" (UID: \"71a2cf81-3a3f-4bc9-8b67-57ad66576390\") " pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.489319 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.498540 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.513668 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.513817 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.514137 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.014118256 +0000 UTC m=+142.949807905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.526237 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.532625 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.547717 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.548039 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.555594 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.563321 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qztkl\" (UniqueName: \"kubernetes.io/projected/73f80696-3504-4a89-9681-4925daceb257-kube-api-access-qztkl\") pod \"csi-hostpathplugin-tfpk7\" (UID: \"73f80696-3504-4a89-9681-4925daceb257\") " pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.564830 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.576610 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.586859 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mhwdb" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.590685 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wd5kx" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.597904 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nrm4s" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.615864 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.616406 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.116387939 +0000 UTC m=+143.052077588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.622990 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.721632 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.721858 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.221835796 +0000 UTC m=+143.157525445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.722953 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.726407 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-22 09:16:08.226389448 +0000 UTC m=+143.162079097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.830222 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.830599 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.330581108 +0000 UTC m=+143.266270757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.875746 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-n5xpn"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.880165 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wgnrq"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.893318 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" event={"ID":"20741111-12b6-4d66-9743-c51d0b8a1a5b","Type":"ContainerStarted","Data":"ce86bceca9d72b4489c22fa3c9be1ecbf7b356f67ce5d37b0eb9176ca5a5b9ab"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.894764 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" event={"ID":"c9204723-54c5-457c-8bb8-58be85f199e2","Type":"ContainerStarted","Data":"2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.894782 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" event={"ID":"c9204723-54c5-457c-8bb8-58be85f199e2","Type":"ContainerStarted","Data":"59e455ede513eb86dd2a13dbee135765f4d25fa89de7bbd2221f7d8b27c0edd8"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.897118 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.905124 4846 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.905294 4846 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-vmbgx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.905339 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" podUID="c9204723-54c5-457c-8bb8-58be85f199e2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.905910 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" event={"ID":"fd5e681f-ca95-4ba0-935e-86f18702cf78","Type":"ContainerStarted","Data":"c9f1302d27bd3065b41bd8f65284614f65d6e7e2d758e286abd551453082cae2"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.907316 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" event={"ID":"22bb9a30-7380-4482-b556-57bed8a7d681","Type":"ContainerStarted","Data":"b66ee4728deac353635c76fe7f9ecd7b19a7ff649a7e54d0b0a44d29e2d6fcad"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.908587 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zdpb4" event={"ID":"42fec650-4eb0-4cb8-adf9-acaebf0ba09e","Type":"ContainerStarted","Data":"07010e65748fcaf23b2c1257fb6ef1597222d0c3d77fe1bee82b83e4b669976b"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.922206 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" event={"ID":"4b0be692-d108-4051-9a33-6529b4ed1e7b","Type":"ContainerStarted","Data":"a6075d82d81e46cc7556a7aff4878293f5685202e5175f933d74257a98507ffe"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.930545 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" event={"ID":"09e9b382-4c2b-440b-978b-3aab0494d892","Type":"ContainerStarted","Data":"54590e1ebb8fdf994536acd22cdf57da05509f16cf5e4bdd5cdfb9aa22be83a0"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.935195 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" event={"ID":"23993c93-7c44-4d7c-8758-1cf3666212a5","Type":"ContainerStarted","Data":"92eb986569e82457862143929329c4a6e3b41708012f787a8f5d104f907c5e9a"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.935518 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" event={"ID":"23993c93-7c44-4d7c-8758-1cf3666212a5","Type":"ContainerStarted","Data":"e8e64719baf2948dcf874865860897743de7dcd4c30cc5b753d0eb8b317a2054"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.942362 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:07 crc kubenswrapper[4846]: E1122 09:16:07.943183 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.443168973 +0000 UTC m=+143.378858612 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.943934 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.947240 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v92tx"] Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.960151 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" event={"ID":"01e5ec75-28e3-4baa-8501-cbe8c740ec3f","Type":"ContainerStarted","Data":"2c83afe419f191f20b35f8ad432774d36970dcde77f50062fb55d73298bbe1d7"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.960208 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" event={"ID":"01e5ec75-28e3-4baa-8501-cbe8c740ec3f","Type":"ContainerStarted","Data":"439286f782b05c7cd04c40aeec94f77687c3087eee6942313d68da2ef403609f"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.970111 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tzhcp" event={"ID":"c7a523a9-8ee9-4bab-8baa-ac393331dd07","Type":"ContainerStarted","Data":"940259bf575fd72745f8a6f1e000928e042ab27d58262ec119470564ab27303e"} Nov 22 09:16:07 crc kubenswrapper[4846]: I1122 09:16:07.977980 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" event={"ID":"8418012a-eb36-472f-a4ea-49d3af8dbd09","Type":"ContainerStarted","Data":"b43b482c62c1f90cb028c0b229e4c996bed085194a980e0f8d4647440f38ff88"} Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.008418 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" event={"ID":"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd","Type":"ContainerStarted","Data":"5d1d5963d637a57f2a13a8f30f17c0ef8759a40bc2dabab3a9ca5858413cf10f"} Nov 22 09:16:08 crc kubenswrapper[4846]: W1122 09:16:08.011509 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb961dfe4_8e3d_4cf8_8032_2293ea7240fe.slice/crio-bce3baa61e7a483f329648e6a2752b13719da6bbaca7c0e545c5d84ad790e11a WatchSource:0}: Error finding container bce3baa61e7a483f329648e6a2752b13719da6bbaca7c0e545c5d84ad790e11a: Status 
404 returned error can't find the container with id bce3baa61e7a483f329648e6a2752b13719da6bbaca7c0e545c5d84ad790e11a Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.034445 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" event={"ID":"f0f79231-ace1-41b1-ae23-f812f404fb67","Type":"ContainerStarted","Data":"e1d08638f9a24a290b7d0d338fb25fb4ecd801a4638947113b62796041d401c5"} Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.034504 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" event={"ID":"f0f79231-ace1-41b1-ae23-f812f404fb67","Type":"ContainerStarted","Data":"9cc8cf04e7a3b5b6951252a5e58dd6aef123e53f0a5be166c9eef4ea18218a73"} Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.043555 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.043711 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.543667265 +0000 UTC m=+143.479356914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.044018 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.045441 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.545425846 +0000 UTC m=+143.481115495 (durationBeforeRetry 500ms). 
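Annotation: the nestedpendingoperations.go:348 records encode kubelet's per-volume retry throttle: after a failure, the same operation is refused until lastFailure plus durationBeforeRetry, which is why each error names an exact "No retries permitted until" timestamp half a second in the future (kubelet can grow this delay exponentially; every record here still shows the initial 500ms). A minimal sketch of that gate, with illustrative field names rather than kubelet's:

package main

import (
	"fmt"
	"time"
)

// retryGate models the "No retries permitted until <t> (durationBeforeRetry <d>)"
// bookkeeping attached to each pending volume operation.
type retryGate struct {
	lastFailure time.Time
	delay       time.Duration
}

func (g *retryGate) fail(now time.Time) { g.lastFailure = now }

func (g *retryGate) allowed(now time.Time) (bool, time.Time) {
	next := g.lastFailure.Add(g.delay)
	return !now.Before(next), next
}

func main() {
	g := &retryGate{delay: 500 * time.Millisecond}
	now := time.Now()
	g.fail(now)
	if ok, next := g.allowed(now.Add(100 * time.Millisecond)); !ok {
		fmt.Printf("No retries permitted until %s (durationBeforeRetry %s)\n", next, g.delay)
	}
	ok, _ := g.allowed(now.Add(600 * time.Millisecond))
	fmt.Println("retry allowed after backoff:", ok) // true
}
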
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.135217 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.145189 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.146920 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.646852875 +0000 UTC m=+143.582542654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.146931 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.152617 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.172470 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.176838 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.179282 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-k86mj"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.210712 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" podStartSLOduration=122.209332368 podStartE2EDuration="2m2.209332368s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:08.201896791 +0000 UTC m=+143.137586440" watchObservedRunningTime="2025-11-22 09:16:08.209332368 +0000 UTC m=+143.145022017" Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 
09:16:08.248564 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.249004 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.748989795 +0000 UTC m=+143.684679444 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.302229 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.350916 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.356971 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6kzvm"] Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.357271 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.857238393 +0000 UTC m=+143.792928042 (durationBeforeRetry 500ms). 
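Annotation: the pod_startup_latency_tracker.go:104 record above can be checked by hand. With no image pulls observed (both pull timestamps are the zero time), podStartSLOduration is simply watchObservedRunningTime minus podCreationTimestamp; for route-controller-manager that is 09:16:08.209332368 minus 09:14:06, i.e. the logged 122.209332368 seconds:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-11-22 09:14:06 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-11-22 09:16:08.209332368 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 2m2.209332368s, matching podStartSLOduration=122.209332368.
	fmt.Println(observed.Sub(created))
}
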
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.360199 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.387604 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.392148 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.462980 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.463568 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:08.963542874 +0000 UTC m=+143.899232523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.566908 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.567855 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:09.067831697 +0000 UTC m=+144.003521346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.582231 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-szfmb" podStartSLOduration=123.582207626 podStartE2EDuration="2m3.582207626s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:08.577270402 +0000 UTC m=+143.512960051" watchObservedRunningTime="2025-11-22 09:16:08.582207626 +0000 UTC m=+143.517897275" Nov 22 09:16:08 crc kubenswrapper[4846]: W1122 09:16:08.597517 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a54ed2d_f7cd_440a_86c3_4c82ce070ac0.slice/crio-a9d090d770f6db273e0348a40b8005a339c36ba4486f5cf25c8dad6dace03b08 WatchSource:0}: Error finding container a9d090d770f6db273e0348a40b8005a339c36ba4486f5cf25c8dad6dace03b08: Status 404 returned error can't find the container with id a9d090d770f6db273e0348a40b8005a339c36ba4486f5cf25c8dad6dace03b08 Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.669737 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.670478 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:09.17046087 +0000 UTC m=+144.106150519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: W1122 09:16:08.708810 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd121269_e390_4baf_bf26_24c5fd4dac70.slice/crio-1916ca69306a2cc2e07f57fff6fc9885aff152ba46d6237a8c450ed79d1b2486 WatchSource:0}: Error finding container 1916ca69306a2cc2e07f57fff6fc9885aff152ba46d6237a8c450ed79d1b2486: Status 404 returned error can't find the container with id 1916ca69306a2cc2e07f57fff6fc9885aff152ba46d6237a8c450ed79d1b2486 Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.773079 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.773587 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:09.273567428 +0000 UTC m=+144.209257067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.799528 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mhwdb"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.833788 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.837261 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-slgpd"] Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.874974 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.875651 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-22 09:16:09.375625756 +0000 UTC m=+144.311315395 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.926181 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9qpp" podStartSLOduration=122.92614842 podStartE2EDuration="2m2.92614842s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:08.920904027 +0000 UTC m=+143.856593686" watchObservedRunningTime="2025-11-22 09:16:08.92614842 +0000 UTC m=+143.861838079" Nov 22 09:16:08 crc kubenswrapper[4846]: I1122 09:16:08.978329 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:08 crc kubenswrapper[4846]: E1122 09:16:08.978708 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:09.478692213 +0000 UTC m=+144.414381862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.007480 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w"] Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.033572 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk"] Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.033657 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tfpk7"] Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.033672 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hmwp7"] Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.039402 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw"] Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.076261 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lgbpn"] Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.076858 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" event={"ID":"b961dfe4-8e3d-4cf8-8032-2293ea7240fe","Type":"ContainerStarted","Data":"bce3baa61e7a483f329648e6a2752b13719da6bbaca7c0e545c5d84ad790e11a"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.080171 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" event={"ID":"c1064a42-3995-4b2a-844d-6b2ea290d5c8","Type":"ContainerStarted","Data":"e149331ce2c50a169e5a4a2facebe7d2f5308f8744223c8bd95eea95a510826b"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.083730 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:09 crc kubenswrapper[4846]: E1122 09:16:09.085194 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:09.585147608 +0000 UTC m=+144.520837257 (durationBeforeRetry 500ms). 
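Annotation: the m=+142 through m=+144 suffixes on these timestamps are not journald output but Go's time.Time.String() monotonic-clock reading: seconds since the logging process (the kubenswrapper/kubelet, [4846]) took its first clock sample, so the value tracks process uptime rather than wall time. The same suffix is easy to reproduce:

package main

import (
	"fmt"
	"time"
)

func main() {
	time.Sleep(500 * time.Millisecond)
	// fmt uses Time.String(), which appends the process-relative monotonic
	// reading, e.g. "... +0000 UTC m=+0.500123456".
	fmt.Println(time.Now())
}
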
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.107816 4846 generic.go:334] "Generic (PLEG): container finished" podID="22bb9a30-7380-4482-b556-57bed8a7d681" containerID="e7faf9594e1377297b5d1268fe309a1a8822182f83eb43bdc3b7eff0f4af057b" exitCode=0 Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.107944 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" event={"ID":"22bb9a30-7380-4482-b556-57bed8a7d681","Type":"ContainerDied","Data":"e7faf9594e1377297b5d1268fe309a1a8822182f83eb43bdc3b7eff0f4af057b"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.110454 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr" event={"ID":"ee403130-f909-4216-a9ff-8a4cb41d4017","Type":"ContainerStarted","Data":"e24b46fabbeadc38c0adfb2f0b5c4c6992d21879c68897743bce4b6b7b639654"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.112403 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l"] Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.113573 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" event={"ID":"9ee64233-aae3-4a4a-815d-ca55dd93bfb4","Type":"ContainerStarted","Data":"111393885a7d927e3221379b6e4c4dadc15ffaa6d355926b4c0b7b6c0906061e"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.114583 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wd5kx"] Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.115913 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nrm4s" event={"ID":"71a2cf81-3a3f-4bc9-8b67-57ad66576390","Type":"ContainerStarted","Data":"b74fe327fef6f2c70d43039283f477e9ce8d0b7838bd3dafb84c33a9e6cb8ef1"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.116614 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" event={"ID":"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf","Type":"ContainerStarted","Data":"ed875c91d53355f6e3399632cfe1a5e2cc75491880e0052edecf724424949894"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.123883 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" event={"ID":"01e5ec75-28e3-4baa-8501-cbe8c740ec3f","Type":"ContainerStarted","Data":"a4ec0a2b7ce4d61b83bfc2fd86a6a9748e341ebc74d3b3c65b063c514fe891f9"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.130765 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" event={"ID":"1925ea63-7ab4-4dfb-90e0-0527730e9e24","Type":"ContainerStarted","Data":"b2c8f9cc50da3a9311c5d63bbe84dfdd67172b68c2d83aaeb1fd9f79d31534ad"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.134634 
4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" event={"ID":"98d61b3e-6191-4d14-823a-f791ddf65cae","Type":"ContainerStarted","Data":"3896f7efc2bcaefb81e40548acaa4a8f70e785e50b68878a454137fe0cbcaf19"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.134738 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" event={"ID":"98d61b3e-6191-4d14-823a-f791ddf65cae","Type":"ContainerStarted","Data":"df905ee694d2856caab7c260345de4a29889ace63c7ffafa9cfaee3430008e74"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.139975 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" event={"ID":"95298591-2816-41fe-8c02-5d15ea156d80","Type":"ContainerStarted","Data":"b6dce936c680d4524f95f2740c7f0eb22554d2757216dfd4f3a821b112f5e174"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.150691 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tzhcp" event={"ID":"c7a523a9-8ee9-4bab-8baa-ac393331dd07","Type":"ContainerStarted","Data":"ab624ec597a30fd9ec8c40279d648d25160b7a2781ab2ca5cb434077a3d1c594"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.171686 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" event={"ID":"4b0be692-d108-4051-9a33-6529b4ed1e7b","Type":"ContainerStarted","Data":"3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.172203 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.174430 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" event={"ID":"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0","Type":"ContainerStarted","Data":"a9d090d770f6db273e0348a40b8005a339c36ba4486f5cf25c8dad6dace03b08"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.174898 4846 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2fp62 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.174928 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" podUID="4b0be692-d108-4051-9a33-6529b4ed1e7b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.180490 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" event={"ID":"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3","Type":"ContainerStarted","Data":"028bb87d08c2bd69a6a0134752b77f964bfb97e65f5e0af325bc580d45334d43"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.185126 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:09 crc kubenswrapper[4846]: E1122 09:16:09.185448 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:09.685425254 +0000 UTC m=+144.621114903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.199121 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zdpb4" event={"ID":"42fec650-4eb0-4cb8-adf9-acaebf0ba09e","Type":"ContainerStarted","Data":"fa2f1951517032594f1744c8cbe4abc7df3c391f67cdb2d702f7c2b3b8626743"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.199202 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.201621 4846 patch_prober.go:28] interesting pod/console-operator-58897d9998-zdpb4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.201805 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zdpb4" podUID="42fec650-4eb0-4cb8-adf9-acaebf0ba09e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.202178 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" event={"ID":"fd5e681f-ca95-4ba0-935e-86f18702cf78","Type":"ContainerStarted","Data":"06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.202690 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.204859 4846 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xxpch container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.204906 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" podUID="fd5e681f-ca95-4ba0-935e-86f18702cf78" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.220937 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" event={"ID":"8418012a-eb36-472f-a4ea-49d3af8dbd09","Type":"ContainerStarted","Data":"ddb3038c512753324786f28a6a9898781d3609fc00e0fedf28fb5f8ef41e88a6"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.232656 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" event={"ID":"20741111-12b6-4d66-9743-c51d0b8a1a5b","Type":"ContainerStarted","Data":"994a61a74eccc89a08d2c17fe14dc1c52c80b1c185631012dea1901a07c349cf"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.258788 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mhwdb" event={"ID":"69e8f812-0fb1-406e-93d0-77093b6344fc","Type":"ContainerStarted","Data":"81ddabb34a8ea705b4cb1df0f2ea913d4dcad75be053e5cf7ee6ca482738d41e"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.262755 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" event={"ID":"4225b942-8fc1-4b47-906a-f443ddc4aab4","Type":"ContainerStarted","Data":"3506f79504701986ec964eef2f4b69b25cbc7df47094df2bfaf23c621efa312f"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.267325 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" event={"ID":"7ab0befc-0749-4a94-9d57-adc79f211e9d","Type":"ContainerStarted","Data":"7153444b998db4eedc202583c9af35d6cc7270a04ee0df92f9f2ee82d808ff75"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.287546 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:09 crc kubenswrapper[4846]: E1122 09:16:09.290038 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:09.790016585 +0000 UTC m=+144.725706234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.292898 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8" event={"ID":"9f953d76-d324-4923-8767-534c7fec6648","Type":"ContainerStarted","Data":"04e4a1295218a4deef33bd2b90df551af0decf49bb05b1c3c2507f3fd5e48d94"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.292966 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8" event={"ID":"9f953d76-d324-4923-8767-534c7fec6648","Type":"ContainerStarted","Data":"def8a17ce3968ab116392e4a75974b73e2987121a6ce7230f31997b1a5593487"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.320826 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k86mj" event={"ID":"f23592b0-b045-4aa5-a22f-c15133890ed4","Type":"ContainerStarted","Data":"7372051fc3a553594ec77d5e8fbd2c9f895cc2418a8f4503d46c06d912f9ce60"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.333944 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" event={"ID":"4bacce7c-168f-4813-a111-58d7e0228cd5","Type":"ContainerStarted","Data":"74e87631f3fa7082f033ed2add8c2799de459fde19220c00d1f4d51bb53c6464"} Nov 22 09:16:09 crc kubenswrapper[4846]: W1122 09:16:09.348878 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod750ea675_e79a_459b_8261_e15dd252a8f1.slice/crio-453b4a338b3b8d4c3e80a93e8c94ca0d6fb5fd67938cf0728c192a1de760639d WatchSource:0}: Error finding container 453b4a338b3b8d4c3e80a93e8c94ca0d6fb5fd67938cf0728c192a1de760639d: Status 404 returned error can't find the container with id 453b4a338b3b8d4c3e80a93e8c94ca0d6fb5fd67938cf0728c192a1de760639d Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.350026 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-n5xpn" event={"ID":"6cc4154d-473a-46cf-acf2-6978d0e642ee","Type":"ContainerStarted","Data":"80cdf562445b7d9b961389c02d6317860c6be1b4f6f1e37f744a8fb70a714d4b"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.350555 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-n5xpn" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.352458 4846 patch_prober.go:28] interesting pod/downloads-7954f5f757-n5xpn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.352507 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n5xpn" podUID="6cc4154d-473a-46cf-acf2-6978d0e642ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Nov 22 
09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.352909 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" event={"ID":"e4b6cce2-f501-46bb-af41-3933baf3205c","Type":"ContainerStarted","Data":"2ead725ace9124c831f1ee4eab6ee1564c6e583efae0625acdcc349ed48cce57"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.359114 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" event={"ID":"cd121269-e390-4baf-bf26-24c5fd4dac70","Type":"ContainerStarted","Data":"1916ca69306a2cc2e07f57fff6fc9885aff152ba46d6237a8c450ed79d1b2486"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.366559 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" event={"ID":"09e9b382-4c2b-440b-978b-3aab0494d892","Type":"ContainerStarted","Data":"068a0049b0a6e431ef0a20bf03b040041fcab3442890857d7bf12889e0f5bf72"} Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.388315 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:09 crc kubenswrapper[4846]: E1122 09:16:09.391403 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:09.891373882 +0000 UTC m=+144.827063531 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.403171 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.427236 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tzhcp" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.433279 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 09:16:09 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld Nov 22 09:16:09 crc kubenswrapper[4846]: [+]process-running ok Nov 22 09:16:09 crc kubenswrapper[4846]: healthz check failed Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.433351 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.497076 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:09 crc kubenswrapper[4846]: E1122 09:16:09.517171 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.016955136 +0000 UTC m=+144.952644785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.599813 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:09 crc kubenswrapper[4846]: E1122 09:16:09.600296 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.100273557 +0000 UTC m=+145.035963196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.705073 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:09 crc kubenswrapper[4846]: E1122 09:16:09.705640 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.20562345 +0000 UTC m=+145.141313099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.809704 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:09 crc kubenswrapper[4846]: E1122 09:16:09.811375 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.311340495 +0000 UTC m=+145.247030144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:09 crc kubenswrapper[4846]: I1122 09:16:09.911878 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:09 crc kubenswrapper[4846]: E1122 09:16:09.912295 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.41228255 +0000 UTC m=+145.347972199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.013116 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.013570 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.513550464 +0000 UTC m=+145.449240113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.018774 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lz8p8" podStartSLOduration=124.018747356 podStartE2EDuration="2m4.018747356s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.014397989 +0000 UTC m=+144.950087638" watchObservedRunningTime="2025-11-22 09:16:10.018747356 +0000 UTC m=+144.954437015" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.055893 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" podStartSLOduration=124.055871579 podStartE2EDuration="2m4.055871579s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.05487776 +0000 UTC m=+144.990567419" watchObservedRunningTime="2025-11-22 09:16:10.055871579 +0000 UTC m=+144.991561228" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.118239 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.118640 4846 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.61862897 +0000 UTC m=+145.554318619 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.161208 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-49tss" podStartSLOduration=124.161173331 podStartE2EDuration="2m4.161173331s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.149717377 +0000 UTC m=+145.085407046" watchObservedRunningTime="2025-11-22 09:16:10.161173331 +0000 UTC m=+145.096862990" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.221228 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.221396 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.721353866 +0000 UTC m=+145.657043505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.227934 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.228636 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.728619348 +0000 UTC m=+145.664308997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.230334 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" podStartSLOduration=125.230317148 podStartE2EDuration="2m5.230317148s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.228937128 +0000 UTC m=+145.164626787" watchObservedRunningTime="2025-11-22 09:16:10.230317148 +0000 UTC m=+145.166006797" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.265485 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v92tx" podStartSLOduration=125.265462533 podStartE2EDuration="2m5.265462533s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.263084044 +0000 UTC m=+145.198773693" watchObservedRunningTime="2025-11-22 09:16:10.265462533 +0000 UTC m=+145.201152182" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.329795 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.330319 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.830299995 +0000 UTC m=+145.765989634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.334764 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zdpb4" podStartSLOduration=125.334733514 podStartE2EDuration="2m5.334733514s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.329976885 +0000 UTC m=+145.265666534" watchObservedRunningTime="2025-11-22 09:16:10.334733514 +0000 UTC m=+145.270423203" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.362728 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zgtrh" podStartSLOduration=124.36268829 podStartE2EDuration="2m4.36268829s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.30169653 +0000 UTC m=+145.237386179" watchObservedRunningTime="2025-11-22 09:16:10.36268829 +0000 UTC m=+145.298377939" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.392578 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-n5xpn" podStartSLOduration=125.392540271 podStartE2EDuration="2m5.392540271s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.372584829 +0000 UTC m=+145.308274488" watchObservedRunningTime="2025-11-22 09:16:10.392540271 +0000 UTC m=+145.328229920" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.397548 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" event={"ID":"9ee64233-aae3-4a4a-815d-ca55dd93bfb4","Type":"ContainerStarted","Data":"c0c2840dea7de23cf2b6ef811ef49bf01b8e7494cea1452e8fd9bd1ce026d027"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.397616 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" event={"ID":"9ee64233-aae3-4a4a-815d-ca55dd93bfb4","Type":"ContainerStarted","Data":"2eaa560b55c9c60909e53a11c179b54b22ebe8516114d8b8dd250515ebfd04c4"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.415790 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-n5xpn" event={"ID":"6cc4154d-473a-46cf-acf2-6978d0e642ee","Type":"ContainerStarted","Data":"ab5d6bfd0fd1a0d933b3ed149a7225379edabc6e5e64f77c48f11172a5de43a3"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.418701 4846 patch_prober.go:28] interesting pod/downloads-7954f5f757-n5xpn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 
10.217.0.25:8080: connect: connection refused" start-of-body= Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.418751 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n5xpn" podUID="6cc4154d-473a-46cf-acf2-6978d0e642ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.433371 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.434308 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 09:16:10 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld Nov 22 09:16:10 crc kubenswrapper[4846]: [+]process-running ok Nov 22 09:16:10 crc kubenswrapper[4846]: healthz check failed Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.434389 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.437018 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tzhcp" podStartSLOduration=124.436998378 podStartE2EDuration="2m4.436998378s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.429358885 +0000 UTC m=+145.365048534" watchObservedRunningTime="2025-11-22 09:16:10.436998378 +0000 UTC m=+145.372688017" Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.437325 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:10.937303887 +0000 UTC m=+145.872993536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.486138 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" event={"ID":"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0","Type":"ContainerStarted","Data":"898b8837d050f320f57d4e5faef53c05c2bd8a6a9e7ea86754c7389f5be690a4"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.487957 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xjfk" podStartSLOduration=125.487944774 podStartE2EDuration="2m5.487944774s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.4857434 +0000 UTC m=+145.421433059" watchObservedRunningTime="2025-11-22 09:16:10.487944774 +0000 UTC m=+145.423634423" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.514331 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" podStartSLOduration=70.514309963 podStartE2EDuration="1m10.514309963s" podCreationTimestamp="2025-11-22 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.513256543 +0000 UTC m=+145.448946192" watchObservedRunningTime="2025-11-22 09:16:10.514309963 +0000 UTC m=+145.449999612" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.534340 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.536602 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.036530352 +0000 UTC m=+145.972220001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.536908 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" event={"ID":"c1064a42-3995-4b2a-844d-6b2ea290d5c8","Type":"ContainerStarted","Data":"c9c2aadd19abde1618a3a4a0e0fbc93bf9d240b87ff1de183f8c0c83d1674088"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.547330 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" event={"ID":"73f80696-3504-4a89-9681-4925daceb257","Type":"ContainerStarted","Data":"9be123181678e9ff7c50033749951935f3448cddd9c8be829fd2451d49af6c63"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.570470 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jwg48" podStartSLOduration=124.570437541 podStartE2EDuration="2m4.570437541s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.562906811 +0000 UTC m=+145.498596480" watchObservedRunningTime="2025-11-22 09:16:10.570437541 +0000 UTC m=+145.506127190" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.581269 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" event={"ID":"1925ea63-7ab4-4dfb-90e0-0527730e9e24","Type":"ContainerStarted","Data":"a7b11d1a8d08dcf7f357135153c05ef35c09c07a86db318df3a44eaa9d676890"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.581452 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.583844 4846 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-8lfzj container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.583887 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" podUID="1925ea63-7ab4-4dfb-90e0-0527730e9e24" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.584927 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" event={"ID":"cd121269-e390-4baf-bf26-24c5fd4dac70","Type":"ContainerStarted","Data":"ac600040e304cbb1caccb83250f71ddf57d45fa2abbacdcaa0212edf1d371ee9"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.605828 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" event={"ID":"95298591-2816-41fe-8c02-5d15ea156d80","Type":"ContainerStarted","Data":"61fef5b5eecbb9076dd03434409e0c4fb746951d6c1871526fc282fdc146b2e1"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.606992 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj" podStartSLOduration=124.606964376 podStartE2EDuration="2m4.606964376s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.604957238 +0000 UTC m=+145.540646887" watchObservedRunningTime="2025-11-22 09:16:10.606964376 +0000 UTC m=+145.542654025" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.607850 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wd5kx" event={"ID":"12ae8954-3863-4839-b6e7-e500df9ec73b","Type":"ContainerStarted","Data":"defc9e33c229b41378026e9216797e5591ecde5927ed438f24d7c5c6a8c84517"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.609495 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k86mj" event={"ID":"f23592b0-b045-4aa5-a22f-c15133890ed4","Type":"ContainerStarted","Data":"e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.629517 4846 generic.go:334] "Generic (PLEG): container finished" podID="b961dfe4-8e3d-4cf8-8032-2293ea7240fe" containerID="0f2063f423c26066f916cdc4ede1c731de31fd840c6e5818b0a31456a541d52b" exitCode=0 Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.629602 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" event={"ID":"b961dfe4-8e3d-4cf8-8032-2293ea7240fe","Type":"ContainerDied","Data":"0f2063f423c26066f916cdc4ede1c731de31fd840c6e5818b0a31456a541d52b"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.637305 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.637768 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.137736094 +0000 UTC m=+146.073425743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.646672 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mg7kh" podStartSLOduration=124.646652044 podStartE2EDuration="2m4.646652044s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.6424056 +0000 UTC m=+145.578095249" watchObservedRunningTime="2025-11-22 09:16:10.646652044 +0000 UTC m=+145.582341693" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.659074 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" event={"ID":"7ab0befc-0749-4a94-9d57-adc79f211e9d","Type":"ContainerStarted","Data":"280b1dccdddb89def714b50dd9294f8ec7cfccac06413708439bed89b040601f"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.693496 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-k86mj" podStartSLOduration=125.693305625 podStartE2EDuration="2m5.693305625s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.690295648 +0000 UTC m=+145.625985297" watchObservedRunningTime="2025-11-22 09:16:10.693305625 +0000 UTC m=+145.628995274" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.707425 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" event={"ID":"4bacce7c-168f-4813-a111-58d7e0228cd5","Type":"ContainerStarted","Data":"6c18fe3a472ab225b923100249eedd3e14e5aa3b2ae77a20c321e31ad284c6eb"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.738769 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.739200 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.239161803 +0000 UTC m=+146.174851452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.739681 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.742812 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.242797439 +0000 UTC m=+146.178487088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.767424 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" event={"ID":"a545b4cc-66ea-4191-9d33-e4e90590e5a8","Type":"ContainerStarted","Data":"1710496f17881a1da3a2473a300ecb01973babd647566114c766047db1349036"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.809164 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kr7qw" podStartSLOduration=124.809143355 podStartE2EDuration="2m4.809143355s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.802078169 +0000 UTC m=+145.737767818" watchObservedRunningTime="2025-11-22 09:16:10.809143355 +0000 UTC m=+145.744833004" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.810090 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" event={"ID":"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf","Type":"ContainerStarted","Data":"258878fe169d9c177d229567f83ed1186c597b06c68cf180a5741cc2419ffa2e"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.840976 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.842376 4846 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.342356044 +0000 UTC m=+146.278045693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.863011 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" event={"ID":"e7797f70-8531-42a8-906f-b16e97b9aabc","Type":"ContainerStarted","Data":"9aa43707b3250acff316d2a7cf86a199bd437ca3ef2e2c52ac99f8e2e4e3c345"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.863104 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" event={"ID":"e7797f70-8531-42a8-906f-b16e97b9aabc","Type":"ContainerStarted","Data":"ed4c3b7388cb15ba4b379e4ae0bf7b1f8114bdf6544217e3ce23fb8e89edf9e4"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.864140 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.877928 4846 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tlxsw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.878011 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" podUID="e7797f70-8531-42a8-906f-b16e97b9aabc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.878537 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr" event={"ID":"ee403130-f909-4216-a9ff-8a4cb41d4017","Type":"ContainerStarted","Data":"c5cb14dd81bb4dda917a0a717e5a572be4cfa1eda536ba3cf244f7215bfe1464"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.891662 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-slgpd" podStartSLOduration=124.891620051 podStartE2EDuration="2m4.891620051s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.885694588 +0000 UTC m=+145.821384237" watchObservedRunningTime="2025-11-22 09:16:10.891620051 +0000 UTC m=+145.827309700" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.908271 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" event={"ID":"750ea675-e79a-459b-8261-e15dd252a8f1","Type":"ContainerStarted","Data":"453b4a338b3b8d4c3e80a93e8c94ca0d6fb5fd67938cf0728c192a1de760639d"} Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.909506 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.918033 4846 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hmwp7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.918123 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" podUID="750ea675-e79a-459b-8261-e15dd252a8f1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.946152 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.946161 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" podStartSLOduration=124.946112171 podStartE2EDuration="2m4.946112171s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.945569525 +0000 UTC m=+145.881259174" watchObservedRunningTime="2025-11-22 09:16:10.946112171 +0000 UTC m=+145.881801820" Nov 22 09:16:10 crc kubenswrapper[4846]: E1122 09:16:10.946926 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.446911874 +0000 UTC m=+146.382601723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.950558 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" event={"ID":"467cf0eb-8a51-4268-b3a9-b308a52aed81","Type":"ContainerStarted","Data":"ddb44fe5313d353739f8d7f090bbe6f9ebc595f415379ed8c4dbb572486cdd42"}
Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.958742 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" event={"ID":"4225b942-8fc1-4b47-906a-f443ddc4aab4","Type":"ContainerStarted","Data":"74beb3f6f41cc5798c52f40cfb7301755a5a23b080f3d7b659793cc15f80fffa"}
Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.968566 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" event={"ID":"d61b0632-6ae3-43fa-b1d5-ebe9671be6cd","Type":"ContainerStarted","Data":"2da3be7177954f7b967c1f0751eacd94bbe1458f9224a8e52074f09785c4a10b"}
Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.982842 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b95xr" podStartSLOduration=124.982796991 podStartE2EDuration="2m4.982796991s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:10.981005819 +0000 UTC m=+145.916695478" watchObservedRunningTime="2025-11-22 09:16:10.982796991 +0000 UTC m=+145.918486640"
Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.986071 4846 generic.go:334] "Generic (PLEG): container finished" podID="ddb3fe43-72d0-41e4-871e-0fa81e7a52a3" containerID="c122a353794c70567793d8cf57b7fcbab472f5b54a1757dcccb96e4eff6bc653" exitCode=0
Nov 22 09:16:10 crc kubenswrapper[4846]: I1122 09:16:10.986250 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" event={"ID":"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3","Type":"ContainerDied","Data":"c122a353794c70567793d8cf57b7fcbab472f5b54a1757dcccb96e4eff6bc653"}
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.052105 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" event={"ID":"9503768e-24bc-4280-ba41-a96116f9523e","Type":"ContainerStarted","Data":"6d1e948e2ad433ea363c542a314a1a857874a720b1f18c35bf79237495756c9f"}
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.054024 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.071671 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" podStartSLOduration=125.071643073 podStartE2EDuration="2m5.071643073s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:11.037634711 +0000 UTC m=+145.973324370" watchObservedRunningTime="2025-11-22 09:16:11.071643073 +0000 UTC m=+146.007332722"
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.091765 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.591732569 +0000 UTC m=+146.527422208 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.095476 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g4frl" podStartSLOduration=126.095452428 podStartE2EDuration="2m6.095452428s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:11.095363265 +0000 UTC m=+146.031052934" watchObservedRunningTime="2025-11-22 09:16:11.095452428 +0000 UTC m=+146.031142077"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.127567 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nrm4s" event={"ID":"71a2cf81-3a3f-4bc9-8b67-57ad66576390","Type":"ContainerStarted","Data":"01ceeaad13fc64f71fc1bccbb25c70e986ad1dfd874a732a9aab1de2b2f9b28e"}
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.167249 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.175325 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.675305157 +0000 UTC m=+146.610994806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.206969 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" event={"ID":"a86e8307-1c84-49f8-ab9e-8602c411ecf9","Type":"ContainerStarted","Data":"6d7a50dc7c1a79bc2c279f20bd785793ce15419c84e34a45f9375d80f97b4824"}
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.207023 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" event={"ID":"a86e8307-1c84-49f8-ab9e-8602c411ecf9","Type":"ContainerStarted","Data":"56845293c53eda7eaeba0aeb130824d6451bcae1fb3c9e852069a0d4a9cb6e84"}
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.207958 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.209713 4846 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-jq2bk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body=
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.209755 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" podUID="a86e8307-1c84-49f8-ab9e-8602c411ecf9" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.216208 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" podStartSLOduration=125.21617516 podStartE2EDuration="2m5.21617516s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:11.194843147 +0000 UTC m=+146.130532796" watchObservedRunningTime="2025-11-22 09:16:11.21617516 +0000 UTC m=+146.151864809"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.268692 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.270114 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.770038241 +0000 UTC m=+146.705727890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.273065 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8" event={"ID":"9f953d76-d324-4923-8767-534c7fec6648","Type":"ContainerStarted","Data":"f2d25bce95818e20293a1f93f543a88938bfc1a545cdd8c42e3bd6a1a8134371"}
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.277819 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" event={"ID":"e4b6cce2-f501-46bb-af41-3933baf3205c","Type":"ContainerStarted","Data":"2f47595af97ea66a055f71e63ddbe83e9df165c03d715ea09e2f8061164ae61d"}
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.296324 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-nrm4s" podStartSLOduration=7.296299527 podStartE2EDuration="7.296299527s" podCreationTimestamp="2025-11-22 09:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:11.242817227 +0000 UTC m=+146.178506876" watchObservedRunningTime="2025-11-22 09:16:11.296299527 +0000 UTC m=+146.231989206"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.306806 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.319675 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk" podStartSLOduration=125.319653179 podStartE2EDuration="2m5.319653179s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:11.295249257 +0000 UTC m=+146.230938906" watchObservedRunningTime="2025-11-22 09:16:11.319653179 +0000 UTC m=+146.255342838"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.320815 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l7j2f" podStartSLOduration=125.320809012 podStartE2EDuration="2m5.320809012s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:11.316063484 +0000 UTC m=+146.251753153" watchObservedRunningTime="2025-11-22 09:16:11.320809012 +0000 UTC m=+146.256498661"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.329865 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.370068 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.373075 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.873060047 +0000 UTC m=+146.808749696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.425805 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-268n8" podStartSLOduration=125.425782485 podStartE2EDuration="2m5.425782485s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:11.356820373 +0000 UTC m=+146.292510022" watchObservedRunningTime="2025-11-22 09:16:11.425782485 +0000 UTC m=+146.361472124"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.446326 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 09:16:11 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld
Nov 22 09:16:11 crc kubenswrapper[4846]: [+]process-running ok
Nov 22 09:16:11 crc kubenswrapper[4846]: healthz check failed
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.446400 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.471599 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.472016 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:11.971995913 +0000 UTC m=+146.907685562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.574481 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.574957 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.074938566 +0000 UTC m=+147.010628215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.675173 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.675449 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.175405518 +0000 UTC m=+147.111095177 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.675764 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.676308 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.176297134 +0000 UTC m=+147.111986783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.777334 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.777742 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.277725393 +0000 UTC m=+147.213415042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.879560 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.880136 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.38010744 +0000 UTC m=+147.315797279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:11 crc kubenswrapper[4846]: I1122 09:16:11.980732 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:11 crc kubenswrapper[4846]: E1122 09:16:11.981201 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.481162828 +0000 UTC m=+147.416852477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.083267 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.083792 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.583769331 +0000 UTC m=+147.519458980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.184361 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.184627 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.684580722 +0000 UTC m=+147.620270371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.184812 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.185292 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.685272763 +0000 UTC m=+147.620962412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.279191 4846 patch_prober.go:28] interesting pod/console-operator-58897d9998-zdpb4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.279280 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zdpb4" podUID="42fec650-4eb0-4cb8-adf9-acaebf0ba09e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.283787 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lgbpn" event={"ID":"9503768e-24bc-4280-ba41-a96116f9523e","Type":"ContainerStarted","Data":"902c2b3c7092fa2c290870846d32194b04e7bdc67f9ba30724228c08860632aa"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.285321 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.285775 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.785757463 +0000 UTC m=+147.721447112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.286935 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" event={"ID":"4225b942-8fc1-4b47-906a-f443ddc4aab4","Type":"ContainerStarted","Data":"8e368929140cd2d55a828d851938674a1130c8e76b727b7bf97e319039c1354a"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.288997 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" event={"ID":"7ab0befc-0749-4a94-9d57-adc79f211e9d","Type":"ContainerStarted","Data":"cf0532eaab14f9aebcc4873100a931d273eb90c2ea6959edd298332680fd8f37"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.290672 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" event={"ID":"09e9b382-4c2b-440b-978b-3aab0494d892","Type":"ContainerStarted","Data":"64b39de7fd4cc0cb667f9b4be8c8c986580e03bf3f4b480a406700c989b0cb1d"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.293114 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" event={"ID":"b961dfe4-8e3d-4cf8-8032-2293ea7240fe","Type":"ContainerStarted","Data":"27bcd9878585d5157326a6de1f9b9bed59b187d728beb335d6cc18ca98fdd629"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.293138 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" event={"ID":"b961dfe4-8e3d-4cf8-8032-2293ea7240fe","Type":"ContainerStarted","Data":"373b730e44591f267e33297f0c3613c165713fd18110a9c3a10e92a1342f184f"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.294866 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" event={"ID":"27b8d18d-4020-4e6f-b0af-9ca8ebe8b2bf","Type":"ContainerStarted","Data":"db3763700163ad9ab6079fbc460c6df6aee6a10d3a2b9d0e1710395a64bb419f"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.297953 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" event={"ID":"22bb9a30-7380-4482-b556-57bed8a7d681","Type":"ContainerStarted","Data":"a9b98c10a65465df6921a95dce5e51be82944ffb893c424e4dded469056618f9"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.298080 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.300731 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" event={"ID":"a545b4cc-66ea-4191-9d33-e4e90590e5a8","Type":"ContainerStarted","Data":"582ff6e2b7b262bb4e3d5075b6e2b9623427d72462b229f067cab56a4c7df909"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.300860 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" event={"ID":"a545b4cc-66ea-4191-9d33-e4e90590e5a8","Type":"ContainerStarted","Data":"c1d873fab937d39ed7a14d03297738e0a598837fec18b8a6a00b3942a663f28f"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.300904 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.302349 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wd5kx" event={"ID":"12ae8954-3863-4839-b6e7-e500df9ec73b","Type":"ContainerStarted","Data":"c38a335d6ea4d6969e9e04430d27b405837d47939f664398e23950bbb0f377e6"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.303581 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" event={"ID":"750ea675-e79a-459b-8261-e15dd252a8f1","Type":"ContainerStarted","Data":"81c67e3f1d03d80f0b87aa0de08e7a8e1c81ba755756e0a6bfc0276753564f4c"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.304929 4846 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hmwp7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.305467 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" podUID="750ea675-e79a-459b-8261-e15dd252a8f1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.310948 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" event={"ID":"467cf0eb-8a51-4268-b3a9-b308a52aed81","Type":"ContainerStarted","Data":"d1e5d8df878174514f760816a1f07548cab29030c5d4a884ffce7291e6d21bc8"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.310992 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" event={"ID":"467cf0eb-8a51-4268-b3a9-b308a52aed81","Type":"ContainerStarted","Data":"dc724cb1facda45fae93a242b6f91b4a09a08c9bacf8de253e815544a0639409"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.313447 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mhwdb" event={"ID":"69e8f812-0fb1-406e-93d0-77093b6344fc","Type":"ContainerStarted","Data":"a975274ad4b1fed79478bd5abdf18b7f072b879854cee9c92474e82b1110254c"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.313492 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mhwdb" event={"ID":"69e8f812-0fb1-406e-93d0-77093b6344fc","Type":"ContainerStarted","Data":"8e59315785bdf81d6666dcf842839e5701466c511904fb03aa00a7b531f1210a"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.313607 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mhwdb"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.315930 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" event={"ID":"ddb3fe43-72d0-41e4-871e-0fa81e7a52a3","Type":"ContainerStarted","Data":"3e643cebf03744ff78849b8f2fe48dcaa5e2a284c7fe2092439bd3fc54aa1211"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.318124 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" event={"ID":"73f80696-3504-4a89-9681-4925daceb257","Type":"ContainerStarted","Data":"b8a938dd505b77d974c780eec90061c3aa3fd4ce3b1de8bb511d56abf6c713ab"}
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.320228 4846 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tlxsw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.320269 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" podUID="e7797f70-8531-42a8-906f-b16e97b9aabc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.321024 4846 patch_prober.go:28] interesting pod/downloads-7954f5f757-n5xpn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.321479 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n5xpn" podUID="6cc4154d-473a-46cf-acf2-6978d0e642ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.345208 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jq2bk"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.375476 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8lfzj"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.381183 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6kzvm" podStartSLOduration=126.381159066 podStartE2EDuration="2m6.381159066s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.344220989 +0000 UTC m=+147.279910658" watchObservedRunningTime="2025-11-22 09:16:12.381159066 +0000 UTC m=+147.316848715"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.396334 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.411419 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:12.911401269 +0000 UTC m=+147.847090918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.430694 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jjv78" podStartSLOduration=126.430673441 podStartE2EDuration="2m6.430673441s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.428839817 +0000 UTC m=+147.364529476" watchObservedRunningTime="2025-11-22 09:16:12.430673441 +0000 UTC m=+147.366363090"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.431632 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" podStartSLOduration=126.431627819 podStartE2EDuration="2m6.431627819s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.383147594 +0000 UTC m=+147.318837253" watchObservedRunningTime="2025-11-22 09:16:12.431627819 +0000 UTC m=+147.367317468"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.434447 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 09:16:12 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld
Nov 22 09:16:12 crc kubenswrapper[4846]: [+]process-running ok
Nov 22 09:16:12 crc kubenswrapper[4846]: healthz check failed
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.434506 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.486897 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-n2tpg" podStartSLOduration=126.48686449 podStartE2EDuration="2m6.48686449s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.483556324 +0000 UTC m=+147.419245993" watchObservedRunningTime="2025-11-22 09:16:12.48686449 +0000 UTC m=+147.422554149"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.498416 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.500739 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.000707484 +0000 UTC m=+147.936397133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.529713 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kgj2l" podStartSLOduration=126.52968879 podStartE2EDuration="2m6.52968879s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.529074962 +0000 UTC m=+147.464764631" watchObservedRunningTime="2025-11-22 09:16:12.52968879 +0000 UTC m=+147.465378439"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.585683 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" podStartSLOduration=127.585647682 podStartE2EDuration="2m7.585647682s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.582970984 +0000 UTC m=+147.518660633" watchObservedRunningTime="2025-11-22 09:16:12.585647682 +0000 UTC m=+147.521337331"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.600859 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.601423 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.101409122 +0000 UTC m=+148.037098771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.619450 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mhwdb" podStartSLOduration=8.619413017 podStartE2EDuration="8.619413017s" podCreationTimestamp="2025-11-22 09:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.616657927 +0000 UTC m=+147.552347576" watchObservedRunningTime="2025-11-22 09:16:12.619413017 +0000 UTC m=+147.555102666"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.702639 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.703000 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.202956675 +0000 UTC m=+148.138646314 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.703813 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.704202 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.204193471 +0000 UTC m=+148.139883120 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.719180 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2zbvb" podStartSLOduration=126.719158437 podStartE2EDuration="2m6.719158437s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.668735346 +0000 UTC m=+147.604425005" watchObservedRunningTime="2025-11-22 09:16:12.719158437 +0000 UTC m=+147.654848086"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.763367 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" podStartSLOduration=127.763344316 podStartE2EDuration="2m7.763344316s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.719695713 +0000 UTC m=+147.655385362" watchObservedRunningTime="2025-11-22 09:16:12.763344316 +0000 UTC m=+147.699033965"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.765064 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wd5kx" podStartSLOduration=8.765055306 podStartE2EDuration="8.765055306s" podCreationTimestamp="2025-11-22 09:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.762277755 +0000 UTC m=+147.697967404" watchObservedRunningTime="2025-11-22 09:16:12.765055306 +0000 UTC m=+147.700744955"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.800467 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w" podStartSLOduration=126.800438749 podStartE2EDuration="2m6.800438749s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:12.793475626 +0000 UTC m=+147.729165275" watchObservedRunningTime="2025-11-22 09:16:12.800438749 +0000 UTC m=+147.736128398"
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.805994 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.806532 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.306507966 +0000 UTC m=+148.242197625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:12 crc kubenswrapper[4846]: I1122 09:16:12.907986 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:12 crc kubenswrapper[4846]: E1122 09:16:12.908531 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.408506861 +0000 UTC m=+148.344196510 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.009966 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.011026 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.511002352 +0000 UTC m=+148.446692001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.112955 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.113635 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.613609685 +0000 UTC m=+148.549299514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.214928 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.215180 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.715139727 +0000 UTC m=+148.650829376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.215818 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.216232 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.716218529 +0000 UTC m=+148.651908178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.317117 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.317394 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.817350319 +0000 UTC m=+148.753039968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.317994 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.318414 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.81840088 +0000 UTC m=+148.754090529 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.329873 4846 generic.go:334] "Generic (PLEG): container finished" podID="8a54ed2d-f7cd-440a-86c3-4c82ce070ac0" containerID="898b8837d050f320f57d4e5faef53c05c2bd8a6a9e7ea86754c7389f5be690a4" exitCode=0
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.329914 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" event={"ID":"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0","Type":"ContainerDied","Data":"898b8837d050f320f57d4e5faef53c05c2bd8a6a9e7ea86754c7389f5be690a4"}
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.331106 4846 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hmwp7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.331466 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" podUID="750ea675-e79a-459b-8261-e15dd252a8f1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused"
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.340515 4846 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg5p container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.340561 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" podUID="22bb9a30-7380-4482-b556-57bed8a7d681" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.425701 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.427947 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:13.927917805 +0000 UTC m=+148.863607454 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.456378 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 09:16:13 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld
Nov 22 09:16:13 crc kubenswrapper[4846]: [+]process-running ok
Nov 22 09:16:13 crc kubenswrapper[4846]: healthz check failed
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.456470 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.528436 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.528899 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.02887528 +0000 UTC m=+148.964564929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.630163 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.630421 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.130383372 +0000 UTC m=+149.066073021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.630765 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.631289 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.131270528 +0000 UTC m=+149.066960177 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.732214 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.732502 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.23246018 +0000 UTC m=+149.168149829 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.733002 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.733419 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.233399717 +0000 UTC m=+149.169089366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.835424 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.835785 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.335746733 +0000 UTC m=+149.271436372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.836276 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.836747 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.336733032 +0000 UTC m=+149.272422681 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.937913 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.938266 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.438231933 +0000 UTC m=+149.373921582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.938532 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.938653 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:13 crc kubenswrapper[4846]: E1122 09:16:13.939226 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.439201621 +0000 UTC m=+149.374891460 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:13 crc kubenswrapper[4846]: I1122 09:16:13.939849 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.039695 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.040103 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.040176 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.040219 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 09:16:14.041658 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.54162787 +0000 UTC m=+149.477317519 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.049087 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.053966 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.057865 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.058834 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.068783 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.077295 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.142224 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 09:16:14.143144 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.64311433 +0000 UTC m=+149.578803989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.157530 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zwm66"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.158730 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.161269 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.194359 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwm66"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.243851 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.244106 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-catalog-content\") pod \"community-operators-zwm66\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.244159 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-utilities\") pod \"community-operators-zwm66\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.244217 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbptv\" (UniqueName: \"kubernetes.io/projected/1a7ef86c-1179-422c-90b7-c2e24e5687e9-kube-api-access-nbptv\") pod \"community-operators-zwm66\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 09:16:14.244367 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.744345954 +0000 UTC m=+149.680035603 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.310226 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rrd7k"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.315176 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.332001 4846 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tlxsw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.332097 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw" podUID="e7797f70-8531-42a8-906f-b16e97b9aabc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.332586 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.347237 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-catalog-content\") pod \"community-operators-zwm66\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.347301 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-utilities\") pod \"community-operators-zwm66\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.347343 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbptv\" (UniqueName: \"kubernetes.io/projected/1a7ef86c-1179-422c-90b7-c2e24e5687e9-kube-api-access-nbptv\") pod \"community-operators-zwm66\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.347372 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 
09:16:14.347736 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.84772104 +0000 UTC m=+149.783410679 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.348794 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-catalog-content\") pod \"community-operators-zwm66\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.349175 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-utilities\") pod \"community-operators-zwm66\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.396200 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbptv\" (UniqueName: \"kubernetes.io/projected/1a7ef86c-1179-422c-90b7-c2e24e5687e9-kube-api-access-nbptv\") pod \"community-operators-zwm66\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.442383 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 09:16:14 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld Nov 22 09:16:14 crc kubenswrapper[4846]: [+]process-running ok Nov 22 09:16:14 crc kubenswrapper[4846]: healthz check failed Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.442482 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.449063 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.449375 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-utilities\") pod \"certified-operators-rrd7k\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " 
pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.449451 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mx6c\" (UniqueName: \"kubernetes.io/projected/2faa496d-af10-40eb-984b-1a67af462dbf-kube-api-access-6mx6c\") pod \"certified-operators-rrd7k\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.449540 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-catalog-content\") pod \"certified-operators-rrd7k\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 09:16:14.449670 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:14.949650823 +0000 UTC m=+149.885340472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.452022 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrd7k"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.461372 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.462124 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.469369 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.469682 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.483794 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.490731 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.546108 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rxk4c"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.550694 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e87d510e-b8fb-4809-9821-2da3eb357d7e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e87d510e-b8fb-4809-9821-2da3eb357d7e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.550759 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-catalog-content\") pod \"certified-operators-rrd7k\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.550793 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-utilities\") pod \"certified-operators-rrd7k\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.550831 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.550861 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mx6c\" (UniqueName: \"kubernetes.io/projected/2faa496d-af10-40eb-984b-1a67af462dbf-kube-api-access-6mx6c\") pod \"certified-operators-rrd7k\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.550882 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e87d510e-b8fb-4809-9821-2da3eb357d7e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e87d510e-b8fb-4809-9821-2da3eb357d7e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.551410 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-catalog-content\") pod \"certified-operators-rrd7k\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.551688 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-utilities\") pod \"certified-operators-rrd7k\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.551836 
4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 09:16:14.551991 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.051976259 +0000 UTC m=+149.987665908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.570435 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxk4c"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.604796 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mx6c\" (UniqueName: \"kubernetes.io/projected/2faa496d-af10-40eb-984b-1a67af462dbf-kube-api-access-6mx6c\") pod \"certified-operators-rrd7k\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.653251 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.667215 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e87d510e-b8fb-4809-9821-2da3eb357d7e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e87d510e-b8fb-4809-9821-2da3eb357d7e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.667385 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2hrd\" (UniqueName: \"kubernetes.io/projected/8b48034e-c5ec-4455-8e2b-f287119ee9aa-kube-api-access-x2hrd\") pod \"community-operators-rxk4c\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.667507 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-catalog-content\") pod \"community-operators-rxk4c\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.667614 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e87d510e-b8fb-4809-9821-2da3eb357d7e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e87d510e-b8fb-4809-9821-2da3eb357d7e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.667668 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-utilities\") pod \"community-operators-rxk4c\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.668194 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e87d510e-b8fb-4809-9821-2da3eb357d7e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e87d510e-b8fb-4809-9821-2da3eb357d7e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 09:16:14.668341 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.168290922 +0000 UTC m=+150.103980571 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.717297 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.723894 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e87d510e-b8fb-4809-9821-2da3eb357d7e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e87d510e-b8fb-4809-9821-2da3eb357d7e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.728596 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7vxjp"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.740504 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.760359 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vxjp"] Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.769143 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2hrd\" (UniqueName: \"kubernetes.io/projected/8b48034e-c5ec-4455-8e2b-f287119ee9aa-kube-api-access-x2hrd\") pod \"community-operators-rxk4c\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.769227 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-catalog-content\") pod \"community-operators-rxk4c\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.769271 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.769325 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-utilities\") pod \"community-operators-rxk4c\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.769853 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-utilities\") pod \"community-operators-rxk4c\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.770824 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-catalog-content\") pod \"community-operators-rxk4c\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 09:16:14.785072 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.285026998 +0000 UTC m=+150.220716647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.829267 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2hrd\" (UniqueName: \"kubernetes.io/projected/8b48034e-c5ec-4455-8e2b-f287119ee9aa-kube-api-access-x2hrd\") pod \"community-operators-rxk4c\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.873698 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.874134 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d87fl\" (UniqueName: \"kubernetes.io/projected/c642ad0c-4276-4e16-af01-7cb25b6eec61-kube-api-access-d87fl\") pod \"certified-operators-7vxjp\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.874164 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-catalog-content\") pod \"certified-operators-7vxjp\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.874214 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-utilities\") pod \"certified-operators-7vxjp\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 09:16:14.874357 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.374337153 +0000 UTC m=+150.310026802 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.976267 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-utilities\") pod \"certified-operators-7vxjp\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.976381 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.976432 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d87fl\" (UniqueName: \"kubernetes.io/projected/c642ad0c-4276-4e16-af01-7cb25b6eec61-kube-api-access-d87fl\") pod \"certified-operators-7vxjp\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.976462 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-catalog-content\") pod \"certified-operators-7vxjp\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.977669 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-catalog-content\") pod \"certified-operators-7vxjp\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.977985 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-utilities\") pod \"certified-operators-7vxjp\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:14 crc kubenswrapper[4846]: E1122 09:16:14.978353 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.478338347 +0000 UTC m=+150.414027996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:14 crc kubenswrapper[4846]: I1122 09:16:14.991524 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.014823 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.031174 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d87fl\" (UniqueName: \"kubernetes.io/projected/c642ad0c-4276-4e16-af01-7cb25b6eec61-kube-api-access-d87fl\") pod \"certified-operators-7vxjp\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.080234 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.080666 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.580625152 +0000 UTC m=+150.516314791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.080782 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.081105 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.581097055 +0000 UTC m=+150.516786704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.098514 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.182420 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.182877 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.682815633 +0000 UTC m=+150.618505292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.182968 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.183418 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.68340147 +0000 UTC m=+150.619091119 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.284202 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.284416 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.784375936 +0000 UTC m=+150.720065585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.285033 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.285471 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.785456947 +0000 UTC m=+150.721146596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: W1122 09:16:15.367451 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-183a9834835f7c3493440fc23dd52c06ce501e213f1684778a6b14d774879204 WatchSource:0}: Error finding container 183a9834835f7c3493440fc23dd52c06ce501e213f1684778a6b14d774879204: Status 404 returned error can't find the container with id 183a9834835f7c3493440fc23dd52c06ce501e213f1684778a6b14d774879204 Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.369848 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b36e97206e68b3788b44ba5dc0054d115538f7631beee87f2601229a5747c035"} Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.376172 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"52fb1a06c67f51a42fcc2e40fb82b73ab9ed67a4c3fbde00d22c023c133d6e8f"} Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.387787 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.388279 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.888254356 +0000 UTC m=+150.823943995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.431317 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 09:16:15 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld Nov 22 09:16:15 crc kubenswrapper[4846]: [+]process-running ok Nov 22 09:16:15 crc kubenswrapper[4846]: healthz check failed Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.431392 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.490253 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.490296 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.490689 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:15.990674894 +0000 UTC m=+150.926364543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.501848 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg5p" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.591289 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-config-volume\") pod \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.591432 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.591489 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfbx6\" (UniqueName: \"kubernetes.io/projected/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-kube-api-access-rfbx6\") pod \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.591609 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-secret-volume\") pod \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\" (UID: \"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0\") " Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.596121 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:16.096075349 +0000 UTC m=+151.031764998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.599325 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a54ed2d-f7cd-440a-86c3-4c82ce070ac0" (UID: "8a54ed2d-f7cd-440a-86c3-4c82ce070ac0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.627267 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a54ed2d-f7cd-440a-86c3-4c82ce070ac0" (UID: "8a54ed2d-f7cd-440a-86c3-4c82ce070ac0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.633792 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-kube-api-access-rfbx6" (OuterVolumeSpecName: "kube-api-access-rfbx6") pod "8a54ed2d-f7cd-440a-86c3-4c82ce070ac0" (UID: "8a54ed2d-f7cd-440a-86c3-4c82ce070ac0"). InnerVolumeSpecName "kube-api-access-rfbx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.641574 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwm66"] Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.668204 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rrd7k"] Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.694336 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.694444 4846 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.694460 4846 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.694471 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfbx6\" (UniqueName: \"kubernetes.io/projected/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0-kube-api-access-rfbx6\") on node \"crc\" DevicePath \"\"" Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.694836 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:16.194817 +0000 UTC m=+151.130506649 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.703077 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxk4c"] Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.764833 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vxjp"] Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.798118 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.798666 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:16.298639289 +0000 UTC m=+151.234328938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.799005 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.799476 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:16.299465513 +0000 UTC m=+151.235155162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.853810 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 09:16:15 crc kubenswrapper[4846]: W1122 09:16:15.876669 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode87d510e_b8fb_4809_9821_2da3eb357d7e.slice/crio-395dd9ac13fbcde3cfa6804ea96b972bc29e91d252790d05ab6299d5260c46d3 WatchSource:0}: Error finding container 395dd9ac13fbcde3cfa6804ea96b972bc29e91d252790d05ab6299d5260c46d3: Status 404 returned error can't find the container with id 395dd9ac13fbcde3cfa6804ea96b972bc29e91d252790d05ab6299d5260c46d3 Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.900377 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:15 crc kubenswrapper[4846]: E1122 09:16:15.901374 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:16.401325744 +0000 UTC m=+151.337015393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:15 crc kubenswrapper[4846]: I1122 09:16:15.947276 4846 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.002216 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:16 crc kubenswrapper[4846]: E1122 09:16:16.002638 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:16.502622959 +0000 UTC m=+151.438312608 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.105576 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:16 crc kubenswrapper[4846]: E1122 09:16:16.105922 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:16.605862211 +0000 UTC m=+151.541551860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.106092 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:16 crc kubenswrapper[4846]: E1122 09:16:16.106754 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 09:16:16.606743397 +0000 UTC m=+151.542433046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nfrf8" (UID: "92822bda-884a-4bfc-b651-f58624599346") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.207066 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:16 crc kubenswrapper[4846]: E1122 09:16:16.207472 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 09:16:16.707444775 +0000 UTC m=+151.643134424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.270675 4846 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-22T09:16:15.947717757Z","Handler":null,"Name":""} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.273942 4846 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.273988 4846 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.288396 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qplc2"] Nov 22 09:16:16 crc kubenswrapper[4846]: E1122 09:16:16.288899 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a54ed2d-f7cd-440a-86c3-4c82ce070ac0" containerName="collect-profiles" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.289015 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a54ed2d-f7cd-440a-86c3-4c82ce070ac0" containerName="collect-profiles" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.289253 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a54ed2d-f7cd-440a-86c3-4c82ce070ac0" containerName="collect-profiles" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.291155 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.295937 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.300834 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qplc2"] Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.312442 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.318470 4846 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.318536 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.372623 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nfrf8\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.394115 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fef493cf8c4ceab9478f591b8f606e1ceb81162833698af03a992bc406e6b08b"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.400359 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e87d510e-b8fb-4809-9821-2da3eb357d7e","Type":"ContainerStarted","Data":"7cd8d5fdabe2bb6c4c65a263e2f7129bd662a3d791309c11618a6e789173e89d"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.400431 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e87d510e-b8fb-4809-9821-2da3eb357d7e","Type":"ContainerStarted","Data":"395dd9ac13fbcde3cfa6804ea96b972bc29e91d252790d05ab6299d5260c46d3"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.403825 4846 generic.go:334] "Generic (PLEG): container finished" podID="2faa496d-af10-40eb-984b-1a67af462dbf" containerID="b1ef4cc77ac4663d8c91a9b5ac841855bccb4a5051969a34c80ea88dc1ef5f9c" exitCode=0 Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.403971 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-rrd7k" event={"ID":"2faa496d-af10-40eb-984b-1a67af462dbf","Type":"ContainerDied","Data":"b1ef4cc77ac4663d8c91a9b5ac841855bccb4a5051969a34c80ea88dc1ef5f9c"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.404007 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrd7k" event={"ID":"2faa496d-af10-40eb-984b-1a67af462dbf","Type":"ContainerStarted","Data":"c22da52be887008f7d93d1dcc8ff897300dc733f24d02dbfd9245c002c7ef774"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.412408 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.413678 4846 generic.go:334] "Generic (PLEG): container finished" podID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerID="e4b01192a1a9e26366ede83d1d199bfc371ddeb30b5a974e112f367f1abcf317" exitCode=0 Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.413757 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxjp" event={"ID":"c642ad0c-4276-4e16-af01-7cb25b6eec61","Type":"ContainerDied","Data":"e4b01192a1a9e26366ede83d1d199bfc371ddeb30b5a974e112f367f1abcf317"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.413797 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxjp" event={"ID":"c642ad0c-4276-4e16-af01-7cb25b6eec61","Type":"ContainerStarted","Data":"0fa2a99b7863ff12b9886fc5202b0e8cdfafd8390ce92bf34a015b5c78ebb845"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.414152 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.414424 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-kube-api-access-t2llq\") pod \"redhat-marketplace-qplc2\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.414482 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-utilities\") pod \"redhat-marketplace-qplc2\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.414579 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-catalog-content\") pod \"redhat-marketplace-qplc2\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.438795 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.439200 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f6d0b2b1ab5e92e2a389d0b42e4d829edbbb087eae84215cb3c70e368bf595cb"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.439644 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.440505 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 09:16:16 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld Nov 22 09:16:16 crc kubenswrapper[4846]: [+]process-running ok Nov 22 09:16:16 crc kubenswrapper[4846]: healthz check failed Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.440542 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.441294 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ab6a3f91d4176c2b9b8eb63a852a840658dd7778a3bb19598bbf561eef5b6913"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.441322 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"183a9834835f7c3493440fc23dd52c06ce501e213f1684778a6b14d774879204"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.449390 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" event={"ID":"73f80696-3504-4a89-9681-4925daceb257","Type":"ContainerStarted","Data":"4d324b46455216f6b4c2befd7ac6db93ce6e515f896661bc3643337e6c1cfbc1"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.449468 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" event={"ID":"73f80696-3504-4a89-9681-4925daceb257","Type":"ContainerStarted","Data":"1b2cbe8f05110517d44e4c70e6c92bf21aa00a064a287944f0af3b354dc87dd5"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.451207 4846 generic.go:334] "Generic (PLEG): container finished" podID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerID="53f24d5605834a9b4de54e274362ad367ef1131956f9e21e51b23ce5a0fe5b82" exitCode=0 Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.451291 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwm66" event={"ID":"1a7ef86c-1179-422c-90b7-c2e24e5687e9","Type":"ContainerDied","Data":"53f24d5605834a9b4de54e274362ad367ef1131956f9e21e51b23ce5a0fe5b82"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.451317 4846 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwm66" event={"ID":"1a7ef86c-1179-422c-90b7-c2e24e5687e9","Type":"ContainerStarted","Data":"0f1ebbd62ba8395b839d2aac068e2733e6520cf00c554fd2eb3238d1c19c817d"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.451648 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.451610658 podStartE2EDuration="2.451610658s" podCreationTimestamp="2025-11-22 09:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:16.43863461 +0000 UTC m=+151.374324259" watchObservedRunningTime="2025-11-22 09:16:16.451610658 +0000 UTC m=+151.387300337" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.457437 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" event={"ID":"8a54ed2d-f7cd-440a-86c3-4c82ce070ac0","Type":"ContainerDied","Data":"a9d090d770f6db273e0348a40b8005a339c36ba4486f5cf25c8dad6dace03b08"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.457496 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9d090d770f6db273e0348a40b8005a339c36ba4486f5cf25c8dad6dace03b08" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.457659 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.459964 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.464874 4846 generic.go:334] "Generic (PLEG): container finished" podID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerID="b84c1ea5a52b237ad064e938798f2be8ec8f62f481cd8fa62917683a39569c4e" exitCode=0 Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.465246 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk4c" event={"ID":"8b48034e-c5ec-4455-8e2b-f287119ee9aa","Type":"ContainerDied","Data":"b84c1ea5a52b237ad064e938798f2be8ec8f62f481cd8fa62917683a39569c4e"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.465292 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk4c" event={"ID":"8b48034e-c5ec-4455-8e2b-f287119ee9aa","Type":"ContainerStarted","Data":"c82803f03abe3b3f19946548201b45a5e0ce12cd87418130f3cecf6c938d091b"} Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.517007 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-catalog-content\") pod \"redhat-marketplace-qplc2\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.525876 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-kube-api-access-t2llq\") pod \"redhat-marketplace-qplc2\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 
09:16:16.526195 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-utilities\") pod \"redhat-marketplace-qplc2\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.523706 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-catalog-content\") pod \"redhat-marketplace-qplc2\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.536260 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-utilities\") pod \"redhat-marketplace-qplc2\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.570740 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-kube-api-access-t2llq\") pod \"redhat-marketplace-qplc2\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.622522 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zdpb4" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.630319 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.630732 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.636005 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.676814 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.677416 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.690096 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fkgwl"] Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.692716 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.697021 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.715547 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkgwl"] Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.779221 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.780589 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.796304 4846 patch_prober.go:28] interesting pod/console-f9d7485db-k86mj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.796380 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-k86mj" podUID="f23592b0-b045-4aa5-a22f-c15133890ed4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.807208 4846 patch_prober.go:28] interesting pod/downloads-7954f5f757-n5xpn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.807239 4846 patch_prober.go:28] interesting pod/downloads-7954f5f757-n5xpn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.807272 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-n5xpn" podUID="6cc4154d-473a-46cf-acf2-6978d0e642ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.807289 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n5xpn" podUID="6cc4154d-473a-46cf-acf2-6978d0e642ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.834665 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-utilities\") pod \"redhat-marketplace-fkgwl\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.834742 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf5js\" (UniqueName: 
\"kubernetes.io/projected/37a85584-057d-4aa8-a753-20600a8f4bab-kube-api-access-mf5js\") pod \"redhat-marketplace-fkgwl\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.834804 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-catalog-content\") pod \"redhat-marketplace-fkgwl\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.936322 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-catalog-content\") pod \"redhat-marketplace-fkgwl\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.936551 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-utilities\") pod \"redhat-marketplace-fkgwl\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.936606 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf5js\" (UniqueName: \"kubernetes.io/projected/37a85584-057d-4aa8-a753-20600a8f4bab-kube-api-access-mf5js\") pod \"redhat-marketplace-fkgwl\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.938732 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-catalog-content\") pod \"redhat-marketplace-fkgwl\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.938925 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-utilities\") pod \"redhat-marketplace-fkgwl\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.984191 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf5js\" (UniqueName: \"kubernetes.io/projected/37a85584-057d-4aa8-a753-20600a8f4bab-kube-api-access-mf5js\") pod \"redhat-marketplace-fkgwl\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:16 crc kubenswrapper[4846]: I1122 09:16:16.990640 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nfrf8"] Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.031885 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.049830 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qplc2"] Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.195142 4846 patch_prober.go:28] interesting pod/apiserver-76f77b778f-wgnrq container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]log ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]etcd ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/generic-apiserver-start-informers ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/max-in-flight-filter ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 22 09:16:17 crc kubenswrapper[4846]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 22 09:16:17 crc kubenswrapper[4846]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/project.openshift.io-projectcache ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/openshift.io-startinformers ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 22 09:16:17 crc kubenswrapper[4846]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 22 09:16:17 crc kubenswrapper[4846]: livez check failed Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.195692 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq" podUID="b961dfe4-8e3d-4cf8-8032-2293ea7240fe" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.304372 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8mn4j"] Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.305455 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.314424 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.324205 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8mn4j"]
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.426311 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tzhcp"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.430587 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 09:16:17 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld
Nov 22 09:16:17 crc kubenswrapper[4846]: [+]process-running ok
Nov 22 09:16:17 crc kubenswrapper[4846]: healthz check failed
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.431002 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.472550 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkgwl"]
Nov 22 09:16:17 crc kubenswrapper[4846]: E1122 09:16:17.482561 4846 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee01bf13_f20b_4778_83bc_ccbb4fa78da4.slice/crio-conmon-040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af.scope\": RecentStats: unable to find data in memory cache]"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.496399 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-catalog-content\") pod \"redhat-operators-8mn4j\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.496608 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lg8g\" (UniqueName: \"kubernetes.io/projected/c0da74d6-4145-4e5d-ac01-a5df2289d427-kube-api-access-2lg8g\") pod \"redhat-operators-8mn4j\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.496813 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-utilities\") pod \"redhat-operators-8mn4j\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.502032 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" event={"ID":"73f80696-3504-4a89-9681-4925daceb257","Type":"ContainerStarted","Data":"0f4f58a35cfd98dd75d9412cd71b983987a97dbf9c7088bdbb9d5f7322b4da01"}
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.511476 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.514727 4846 generic.go:334] "Generic (PLEG): container finished" podID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerID="040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af" exitCode=0
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.514833 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qplc2" event={"ID":"ee01bf13-f20b-4778-83bc-ccbb4fa78da4","Type":"ContainerDied","Data":"040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af"}
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.514875 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qplc2" event={"ID":"ee01bf13-f20b-4778-83bc-ccbb4fa78da4","Type":"ContainerStarted","Data":"555a7776bfb9c5e7a1872cd596d31f311f06b2b289d0f04b1bacef1ed233b0a1"}
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.534678 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" event={"ID":"92822bda-884a-4bfc-b651-f58624599346","Type":"ContainerStarted","Data":"77932a7e98b3339e204a32539f23d75b76d8eaae2400997125a7ab88f76aa9a1"}
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.534724 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" event={"ID":"92822bda-884a-4bfc-b651-f58624599346","Type":"ContainerStarted","Data":"f811e52055a7ead2989ef643901f3531923566c22d00322ae616c67e6b33fcca"}
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.535693 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.552306 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tfpk7" podStartSLOduration=13.552270437 podStartE2EDuration="13.552270437s" podCreationTimestamp="2025-11-22 09:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:17.52563807 +0000 UTC m=+152.461327719" watchObservedRunningTime="2025-11-22 09:16:17.552270437 +0000 UTC m=+152.487960086"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.562923 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e87d510e-b8fb-4809-9821-2da3eb357d7e","Type":"ContainerDied","Data":"7cd8d5fdabe2bb6c4c65a263e2f7129bd662a3d791309c11618a6e789173e89d"}
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.562885 4846 generic.go:334] "Generic (PLEG): container finished" podID="e87d510e-b8fb-4809-9821-2da3eb357d7e" containerID="7cd8d5fdabe2bb6c4c65a263e2f7129bd662a3d791309c11618a6e789173e89d" exitCode=0
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.593287 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tlxsw"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.595407 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zxtgf"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.597872 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-utilities\") pod \"redhat-operators-8mn4j\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.598089 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-catalog-content\") pod \"redhat-operators-8mn4j\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.598124 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lg8g\" (UniqueName: \"kubernetes.io/projected/c0da74d6-4145-4e5d-ac01-a5df2289d427-kube-api-access-2lg8g\") pod \"redhat-operators-8mn4j\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.599591 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-utilities\") pod \"redhat-operators-8mn4j\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.601783 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-catalog-content\") pod \"redhat-operators-8mn4j\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.651933 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lg8g\" (UniqueName: \"kubernetes.io/projected/c0da74d6-4145-4e5d-ac01-a5df2289d427-kube-api-access-2lg8g\") pod \"redhat-operators-8mn4j\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.702870 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" podStartSLOduration=131.70284826 podStartE2EDuration="2m11.70284826s" podCreationTimestamp="2025-11-22 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:17.702118799 +0000 UTC m=+152.637808458" watchObservedRunningTime="2025-11-22 09:16:17.70284826 +0000 UTC m=+152.638537909"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.773284 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9cktz"]
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.774679 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.780136 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9cktz"]
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.902524 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-catalog-content\") pod \"redhat-operators-9cktz\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.903143 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9dx\" (UniqueName: \"kubernetes.io/projected/45f10022-ca14-42a8-bb6f-e28b7df2da4e-kube-api-access-wg9dx\") pod \"redhat-operators-9cktz\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.903229 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-utilities\") pod \"redhat-operators-9cktz\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:17 crc kubenswrapper[4846]: I1122 09:16:17.936529 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8mn4j"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.005233 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9dx\" (UniqueName: \"kubernetes.io/projected/45f10022-ca14-42a8-bb6f-e28b7df2da4e-kube-api-access-wg9dx\") pod \"redhat-operators-9cktz\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.005358 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-utilities\") pod \"redhat-operators-9cktz\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.005434 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-catalog-content\") pod \"redhat-operators-9cktz\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.006453 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-catalog-content\") pod \"redhat-operators-9cktz\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.007063 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-utilities\") pod \"redhat-operators-9cktz\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.033462 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9dx\" (UniqueName: \"kubernetes.io/projected/45f10022-ca14-42a8-bb6f-e28b7df2da4e-kube-api-access-wg9dx\") pod \"redhat-operators-9cktz\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.044343 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.111092 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9cktz"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.435968 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 09:16:18 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld
Nov 22 09:16:18 crc kubenswrapper[4846]: [+]process-running ok
Nov 22 09:16:18 crc kubenswrapper[4846]: healthz check failed
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.436512 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.511549 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9cktz"]
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.577433 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cktz" event={"ID":"45f10022-ca14-42a8-bb6f-e28b7df2da4e","Type":"ContainerStarted","Data":"9f9ca63eb3a9b34daffe23327dc18c4175e2837d5c9d6d8827fcaf69678ebdfb"}
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.581611 4846 generic.go:334] "Generic (PLEG): container finished" podID="37a85584-057d-4aa8-a753-20600a8f4bab" containerID="a0c747a19d6c982bfb57a079491a8ce3ddb5d85db641d23ae09026e08f1b0ede" exitCode=0
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.583775 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkgwl" event={"ID":"37a85584-057d-4aa8-a753-20600a8f4bab","Type":"ContainerDied","Data":"a0c747a19d6c982bfb57a079491a8ce3ddb5d85db641d23ae09026e08f1b0ede"}
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.583832 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkgwl" event={"ID":"37a85584-057d-4aa8-a753-20600a8f4bab","Type":"ContainerStarted","Data":"eac658e78d7c874db0c7741ecbeb6108cee98e763dcfbb57e30a0ee567dacb99"}
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.594466 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8mn4j"]
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.832281 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.927262 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e87d510e-b8fb-4809-9821-2da3eb357d7e-kubelet-dir\") pod \"e87d510e-b8fb-4809-9821-2da3eb357d7e\" (UID: \"e87d510e-b8fb-4809-9821-2da3eb357d7e\") "
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.927547 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e87d510e-b8fb-4809-9821-2da3eb357d7e-kube-api-access\") pod \"e87d510e-b8fb-4809-9821-2da3eb357d7e\" (UID: \"e87d510e-b8fb-4809-9821-2da3eb357d7e\") "
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.928257 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e87d510e-b8fb-4809-9821-2da3eb357d7e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e87d510e-b8fb-4809-9821-2da3eb357d7e" (UID: "e87d510e-b8fb-4809-9821-2da3eb357d7e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.928847 4846 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e87d510e-b8fb-4809-9821-2da3eb357d7e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 22 09:16:18 crc kubenswrapper[4846]: I1122 09:16:18.938512 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87d510e-b8fb-4809-9821-2da3eb357d7e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e87d510e-b8fb-4809-9821-2da3eb357d7e" (UID: "e87d510e-b8fb-4809-9821-2da3eb357d7e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.031318 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e87d510e-b8fb-4809-9821-2da3eb357d7e-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.430333 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 09:16:19 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld
Nov 22 09:16:19 crc kubenswrapper[4846]: [+]process-running ok
Nov 22 09:16:19 crc kubenswrapper[4846]: healthz check failed
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.430445 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.565753 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Nov 22 09:16:19 crc kubenswrapper[4846]: E1122 09:16:19.566081 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87d510e-b8fb-4809-9821-2da3eb357d7e" containerName="pruner"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.566098 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87d510e-b8fb-4809-9821-2da3eb357d7e" containerName="pruner"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.566239 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87d510e-b8fb-4809-9821-2da3eb357d7e" containerName="pruner"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.569896 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.572296 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.572363 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.576902 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.610558 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.610718 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e87d510e-b8fb-4809-9821-2da3eb357d7e","Type":"ContainerDied","Data":"395dd9ac13fbcde3cfa6804ea96b972bc29e91d252790d05ab6299d5260c46d3"}
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.610789 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="395dd9ac13fbcde3cfa6804ea96b972bc29e91d252790d05ab6299d5260c46d3"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.624141 4846 generic.go:334] "Generic (PLEG): container finished" podID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerID="d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497" exitCode=0
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.624224 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cktz" event={"ID":"45f10022-ca14-42a8-bb6f-e28b7df2da4e","Type":"ContainerDied","Data":"d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497"}
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.627401 4846 generic.go:334] "Generic (PLEG): container finished" podID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerID="e1d37d317cbef7a42b928be510127bdc51103001b4bd3830b4f6602ec97bcaa6" exitCode=0
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.627670 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mn4j" event={"ID":"c0da74d6-4145-4e5d-ac01-a5df2289d427","Type":"ContainerDied","Data":"e1d37d317cbef7a42b928be510127bdc51103001b4bd3830b4f6602ec97bcaa6"}
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.627737 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mn4j" event={"ID":"c0da74d6-4145-4e5d-ac01-a5df2289d427","Type":"ContainerStarted","Data":"7a96238c807168d88b8ef041ef7b4f3b9c7278e2d35668064cffea1340d59a3e"}
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.642205 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.642258 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.744289 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.744598 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.747080 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.770637 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:19 crc kubenswrapper[4846]: I1122 09:16:19.897669 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:20 crc kubenswrapper[4846]: I1122 09:16:20.261623 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Nov 22 09:16:20 crc kubenswrapper[4846]: I1122 09:16:20.436178 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 09:16:20 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld
Nov 22 09:16:20 crc kubenswrapper[4846]: [+]process-running ok
Nov 22 09:16:20 crc kubenswrapper[4846]: healthz check failed
Nov 22 09:16:20 crc kubenswrapper[4846]: I1122 09:16:20.436281 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 09:16:20 crc kubenswrapper[4846]: I1122 09:16:20.662968 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf","Type":"ContainerStarted","Data":"a5cca2437765a0516441073e42940e6f6cb70ecfa9a830991aca26f4ff7b4a88"}
Nov 22 09:16:21 crc kubenswrapper[4846]: I1122 09:16:21.428549 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 09:16:21 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld
Nov 22 09:16:21 crc kubenswrapper[4846]: [+]process-running ok
Nov 22 09:16:21 crc kubenswrapper[4846]: healthz check failed
Nov 22 09:16:21 crc kubenswrapper[4846]: I1122 09:16:21.428615 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 09:16:21 crc kubenswrapper[4846]: I1122 09:16:21.635266 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq"
Nov 22 09:16:21 crc kubenswrapper[4846]: I1122 09:16:21.640806 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wgnrq"
Nov 22 09:16:22 crc kubenswrapper[4846]: I1122 09:16:22.426864 4846 patch_prober.go:28] interesting pod/router-default-5444994796-tzhcp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 09:16:22 crc kubenswrapper[4846]: [-]has-synced failed: reason withheld
Nov 22 09:16:22 crc kubenswrapper[4846]: [+]process-running ok
Nov 22 09:16:22 crc kubenswrapper[4846]: healthz check failed
Nov 22 09:16:22 crc kubenswrapper[4846]: I1122 09:16:22.426942 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tzhcp" podUID="c7a523a9-8ee9-4bab-8baa-ac393331dd07" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 09:16:22 crc kubenswrapper[4846]: I1122 09:16:22.591356 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mhwdb"
Nov 22 09:16:22 crc kubenswrapper[4846]: I1122 09:16:22.700926 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf","Type":"ContainerStarted","Data":"dd1e4913300daf9212cafecaa42cde8da82f0ae15db9bb43437a05d9ca15d103"}
Nov 22 09:16:22 crc kubenswrapper[4846]: I1122 09:16:22.723213 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.723190724 podStartE2EDuration="3.723190724s" podCreationTimestamp="2025-11-22 09:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:16:22.719720043 +0000 UTC m=+157.655409712" watchObservedRunningTime="2025-11-22 09:16:22.723190724 +0000 UTC m=+157.658880373"
Nov 22 09:16:23 crc kubenswrapper[4846]: I1122 09:16:23.429175 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tzhcp"
Nov 22 09:16:23 crc kubenswrapper[4846]: I1122 09:16:23.431806 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tzhcp"
Nov 22 09:16:23 crc kubenswrapper[4846]: I1122 09:16:23.715196 4846 generic.go:334] "Generic (PLEG): container finished" podID="8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf" containerID="dd1e4913300daf9212cafecaa42cde8da82f0ae15db9bb43437a05d9ca15d103" exitCode=0
Nov 22 09:16:23 crc kubenswrapper[4846]: I1122 09:16:23.717268 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf","Type":"ContainerDied","Data":"dd1e4913300daf9212cafecaa42cde8da82f0ae15db9bb43437a05d9ca15d103"}
Nov 22 09:16:26 crc kubenswrapper[4846]: I1122 09:16:26.771820 4846 patch_prober.go:28] interesting pod/console-f9d7485db-k86mj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Nov 22 09:16:26 crc kubenswrapper[4846]: I1122 09:16:26.772181 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-k86mj" podUID="f23592b0-b045-4aa5-a22f-c15133890ed4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused"
Nov 22 09:16:26 crc kubenswrapper[4846]: I1122 09:16:26.805242 4846 patch_prober.go:28] interesting pod/downloads-7954f5f757-n5xpn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Nov 22 09:16:26 crc kubenswrapper[4846]: I1122 09:16:26.805303 4846 patch_prober.go:28] interesting pod/downloads-7954f5f757-n5xpn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Nov 22 09:16:26 crc kubenswrapper[4846]: I1122 09:16:26.805340 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-n5xpn" podUID="6cc4154d-473a-46cf-acf2-6978d0e642ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Nov 22 09:16:26 crc kubenswrapper[4846]: I1122 09:16:26.805374 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n5xpn" podUID="6cc4154d-473a-46cf-acf2-6978d0e642ee" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Nov 22 09:16:27 crc kubenswrapper[4846]: I1122 09:16:27.849760 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm"
Nov 22 09:16:27 crc kubenswrapper[4846]: I1122 09:16:27.858125 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e79bf3c4-87ae-4009-9a11-d26130912fef-metrics-certs\") pod \"network-metrics-daemon-79xpm\" (UID: \"e79bf3c4-87ae-4009-9a11-d26130912fef\") " pod="openshift-multus/network-metrics-daemon-79xpm"
Nov 22 09:16:27 crc kubenswrapper[4846]: I1122 09:16:27.978255 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-79xpm"
Nov 22 09:16:28 crc kubenswrapper[4846]: I1122 09:16:28.626437 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:16:28 crc kubenswrapper[4846]: I1122 09:16:28.627000 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.262660 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.319240 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kubelet-dir\") pod \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\" (UID: \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\") "
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.319353 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kube-api-access\") pod \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\" (UID: \"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf\") "
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.319408 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf" (UID: "8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.319753 4846 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.326591 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf" (UID: "8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.422382 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.788234 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf","Type":"ContainerDied","Data":"a5cca2437765a0516441073e42940e6f6cb70ecfa9a830991aca26f4ff7b4a88"}
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.788289 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5cca2437765a0516441073e42940e6f6cb70ecfa9a830991aca26f4ff7b4a88"
Nov 22 09:16:31 crc kubenswrapper[4846]: I1122 09:16:31.788290 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 09:16:36 crc kubenswrapper[4846]: I1122 09:16:36.052930 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-79xpm"]
Nov 22 09:16:36 crc kubenswrapper[4846]: I1122 09:16:36.467552 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8"
Nov 22 09:16:36 crc kubenswrapper[4846]: I1122 09:16:36.776859 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-k86mj"
Nov 22 09:16:36 crc kubenswrapper[4846]: I1122 09:16:36.781328 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-k86mj"
Nov 22 09:16:36 crc kubenswrapper[4846]: I1122 09:16:36.812967 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-n5xpn"
Nov 22 09:16:47 crc kubenswrapper[4846]: I1122 09:16:47.538541 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv65w"
Nov 22 09:16:52 crc kubenswrapper[4846]: W1122 09:16:52.388547 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode79bf3c4_87ae_4009_9a11_d26130912fef.slice/crio-539d5e417feb121c7f224b42b0e592810c90875e6cbd5dcb3160e01d89f08730 WatchSource:0}: Error finding container 539d5e417feb121c7f224b42b0e592810c90875e6cbd5dcb3160e01d89f08730: Status 404 returned error can't find the container with id 539d5e417feb121c7f224b42b0e592810c90875e6cbd5dcb3160e01d89f08730
Nov 22 09:16:52 crc kubenswrapper[4846]: E1122 09:16:52.424432 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Nov 22 09:16:52 crc kubenswrapper[4846]: E1122 09:16:52.424679 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2hrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rxk4c_openshift-marketplace(8b48034e-c5ec-4455-8e2b-f287119ee9aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 09:16:52 crc kubenswrapper[4846]: E1122 09:16:52.425852 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rxk4c" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa"
Nov 22 09:16:52 crc kubenswrapper[4846]: I1122 09:16:52.920515 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-79xpm" event={"ID":"e79bf3c4-87ae-4009-9a11-d26130912fef","Type":"ContainerStarted","Data":"539d5e417feb121c7f224b42b0e592810c90875e6cbd5dcb3160e01d89f08730"}
Nov 22 09:16:54 crc kubenswrapper[4846]: E1122 09:16:54.020509 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Nov 22 09:16:54 crc kubenswrapper[4846]: E1122 09:16:54.020720 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d87fl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7vxjp_openshift-marketplace(c642ad0c-4276-4e16-af01-7cb25b6eec61): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 09:16:54 crc kubenswrapper[4846]: E1122 09:16:54.022157 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7vxjp" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61"
Nov 22 09:16:54 crc kubenswrapper[4846]: I1122 09:16:54.128154 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 09:16:54 crc kubenswrapper[4846]: E1122 09:16:54.684891 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Nov 22 09:16:54 crc kubenswrapper[4846]: E1122 09:16:54.685095 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nbptv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zwm66_openshift-marketplace(1a7ef86c-1179-422c-90b7-c2e24e5687e9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 09:16:54 crc kubenswrapper[4846]: E1122 09:16:54.687886 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zwm66" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9"
Nov 22 09:16:57 crc kubenswrapper[4846]: E1122 09:16:57.227675 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 22 09:16:57 crc kubenswrapper[4846]: E1122 09:16:57.228030 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2llq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-qplc2_openshift-marketplace(ee01bf13-f20b-4778-83bc-ccbb4fa78da4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 09:16:57 crc kubenswrapper[4846]: E1122 09:16:57.229270 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qplc2" podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4"
Nov 22 09:16:58 crc kubenswrapper[4846]: I1122 09:16:58.625747 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:16:58 crc kubenswrapper[4846]: I1122 09:16:58.626478 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.021694 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rxk4c" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.021787 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zwm66" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.022212 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7vxjp" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.022460 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-qplc2" podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.149148 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.149380 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mf5js,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fkgwl_openshift-marketplace(37a85584-057d-4aa8-a753-20600a8f4bab): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.150848 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fkgwl" podUID="37a85584-057d-4aa8-a753-20600a8f4bab"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.284277 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.284482 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wg9dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9cktz_openshift-marketplace(45f10022-ca14-42a8-bb6f-e28b7df2da4e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.286693 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9cktz" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.400113 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.400352 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2lg8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8mn4j_openshift-marketplace(c0da74d6-4145-4e5d-ac01-a5df2289d427): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.403227 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8mn4j" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427"
Nov 22 09:16:59 crc kubenswrapper[4846]: I1122 09:16:59.975694 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-79xpm" event={"ID":"e79bf3c4-87ae-4009-9a11-d26130912fef","Type":"ContainerStarted","Data":"d133ae3cff66aebaaa9617301b020d6f024e99cdacd92c7de5e4ded5d9d01b04"}
Nov 22 09:16:59 crc kubenswrapper[4846]: I1122 09:16:59.979060 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrd7k" event={"ID":"2faa496d-af10-40eb-984b-1a67af462dbf","Type":"ContainerStarted","Data":"4fe418646a389460a345c63661a36b27f976c7fedff34b0cb19bb9ae9e725c64"}
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.980853 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8mn4j" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.981216 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fkgwl" podUID="37a85584-057d-4aa8-a753-20600a8f4bab"
Nov 22 09:16:59 crc kubenswrapper[4846]: E1122 09:16:59.981788 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9cktz" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e"
Nov 22 09:17:00 crc kubenswrapper[4846]: I1122 09:17:00.988329 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-79xpm" event={"ID":"e79bf3c4-87ae-4009-9a11-d26130912fef","Type":"ContainerStarted","Data":"f360a67725107f34b1fe1e7a38dc6bd07ad14f0a10d2a686019755e76c087a37"}
Nov 22 09:17:00 crc kubenswrapper[4846]: I1122 09:17:00.990338 4846 generic.go:334] "Generic (PLEG): container finished" podID="2faa496d-af10-40eb-984b-1a67af462dbf" containerID="4fe418646a389460a345c63661a36b27f976c7fedff34b0cb19bb9ae9e725c64" exitCode=0
Nov 22 09:17:00 crc kubenswrapper[4846]: I1122 09:17:00.990378 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrd7k" event={"ID":"2faa496d-af10-40eb-984b-1a67af462dbf","Type":"ContainerDied","Data":"4fe418646a389460a345c63661a36b27f976c7fedff34b0cb19bb9ae9e725c64"}
Nov 22 09:17:02 crc kubenswrapper[4846]: I1122 09:17:02.012944 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-79xpm" podStartSLOduration=177.012924703 podStartE2EDuration="2m57.012924703s" podCreationTimestamp="2025-11-22 09:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:17:02.01283411 +0000 UTC m=+196.948523769" watchObservedRunningTime="2025-11-22 09:17:02.012924703 +0000 UTC m=+196.948614352"
Nov 22 09:17:04 crc kubenswrapper[4846]: I1122 09:17:04.009342 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrd7k" event={"ID":"2faa496d-af10-40eb-984b-1a67af462dbf","Type":"ContainerStarted","Data":"498b8d378e718bb4066a67aa10277ce33ab4ac612dec67f38b1f7208a5cee211"}
Nov 22 09:17:04 crc kubenswrapper[4846]: I1122 09:17:04.030796 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rrd7k" podStartSLOduration=3.5516374429999997 podStartE2EDuration="50.030769443s" podCreationTimestamp="2025-11-22 09:16:14 +0000 UTC" firstStartedPulling="2025-11-22 09:16:16.412011483 +0000 UTC m=+151.347701132" lastFinishedPulling="2025-11-22 09:17:02.891143483 +0000 UTC m=+197.826833132" observedRunningTime="2025-11-22 09:17:04.027014416 +0000 UTC m=+198.962704065" watchObservedRunningTime="2025-11-22 09:17:04.030769443 +0000 UTC m=+198.966459122"
Nov 22 09:17:04 crc kubenswrapper[4846]: I1122 09:17:04.718413 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rrd7k"
Nov 22 09:17:04 crc kubenswrapper[4846]: I1122 09:17:04.719074 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rrd7k"
Nov 22 09:17:04 crc kubenswrapper[4846]: I1122 09:17:04.889821 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rrd7k"
Nov 22 09:17:12 crc kubenswrapper[4846]: I1122 09:17:12.076174 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mn4j" event={"ID":"c0da74d6-4145-4e5d-ac01-a5df2289d427","Type":"ContainerStarted","Data":"bb6b6b96db00b419ce5d373482d9cdd8c611f6f241b4cb73cbd2022750b803b1"}
Nov 22 09:17:13 crc kubenswrapper[4846]: I1122 09:17:13.089029 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cktz" event={"ID":"45f10022-ca14-42a8-bb6f-e28b7df2da4e","Type":"ContainerStarted","Data":"d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba"}
Nov 22 09:17:13 crc kubenswrapper[4846]: I1122 09:17:13.096884 4846 generic.go:334] "Generic (PLEG): container finished" podID="37a85584-057d-4aa8-a753-20600a8f4bab" containerID="f7fc8e50b762894e80ff0d3085533748af9680f24c9269777b722ae928c31d18" exitCode=0
Nov 22 09:17:13 crc kubenswrapper[4846]: I1122 09:17:13.097060 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkgwl" event={"ID":"37a85584-057d-4aa8-a753-20600a8f4bab","Type":"ContainerDied","Data":"f7fc8e50b762894e80ff0d3085533748af9680f24c9269777b722ae928c31d18"}
Nov 22 09:17:13 crc kubenswrapper[4846]: I1122 09:17:13.113599 4846 generic.go:334] "Generic (PLEG): container finished" podID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerID="bb6b6b96db00b419ce5d373482d9cdd8c611f6f241b4cb73cbd2022750b803b1" exitCode=0
Nov 22 09:17:13 crc kubenswrapper[4846]: I1122 09:17:13.113689 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mn4j" event={"ID":"c0da74d6-4145-4e5d-ac01-a5df2289d427","Type":"ContainerDied","Data":"bb6b6b96db00b419ce5d373482d9cdd8c611f6f241b4cb73cbd2022750b803b1"}
Nov 22 09:17:14 crc kubenswrapper[4846]: I1122 09:17:14.122364 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mn4j" event={"ID":"c0da74d6-4145-4e5d-ac01-a5df2289d427","Type":"ContainerStarted","Data":"1111e8627056968b78bdb076952ee80fe1503376ed228b810aa9143bef239697"}
Nov 22 09:17:14 crc kubenswrapper[4846]: I1122 09:17:14.124508 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk4c" event={"ID":"8b48034e-c5ec-4455-8e2b-f287119ee9aa","Type":"ContainerStarted","Data":"f8843ae37042c98afbc71f3782171a798d9483a84e75e0812504fed73d50ffe7"}
Nov 22 09:17:14 crc kubenswrapper[4846]: I1122 09:17:14.127833 4846 generic.go:334] "Generic (PLEG): container finished" podID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerID="d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba" exitCode=0
Nov 22 09:17:14 crc kubenswrapper[4846]: I1122 09:17:14.127882 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cktz" event={"ID":"45f10022-ca14-42a8-bb6f-e28b7df2da4e","Type":"ContainerDied","Data":"d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba"}
Nov 22 09:17:14 crc kubenswrapper[4846]: I1122 09:17:14.136806 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwm66" event={"ID":"1a7ef86c-1179-422c-90b7-c2e24e5687e9","Type":"ContainerStarted","Data":"aa04e1af7cf6179fa3a4ce5dcaeaa923541127719e196bc60a7cd78115f9b457"}
Nov 22 09:17:14 crc kubenswrapper[4846]: I1122 09:17:14.138988 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkgwl" event={"ID":"37a85584-057d-4aa8-a753-20600a8f4bab","Type":"ContainerStarted","Data":"b915165b25430ccf239c8dff041a58803a8c94719136498a5e6b78e95213f3c9"}
Nov 22 09:17:14 crc kubenswrapper[4846]: I1122 09:17:14.148789 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8mn4j" podStartSLOduration=3.2300419639999998 podStartE2EDuration="57.148751226s" podCreationTimestamp="2025-11-22 09:16:17 +0000 UTC" firstStartedPulling="2025-11-22 09:16:19.633498544 +0000 UTC m=+154.569188193" lastFinishedPulling="2025-11-22 09:17:13.552207806 +0000 UTC m=+208.487897455" observedRunningTime="2025-11-22 09:17:14.145761483 +0000 UTC m=+209.081451132" watchObservedRunningTime="2025-11-22 09:17:14.148751226 +0000 UTC m=+209.084440875"
Nov 22 09:17:14 crc kubenswrapper[4846]: I1122 09:17:14.192012 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fkgwl" podStartSLOduration=3.187654473 podStartE2EDuration="58.191954518s" podCreationTimestamp="2025-11-22 09:16:16 +0000 UTC" firstStartedPulling="2025-11-22 09:16:18.650467366 +0000 UTC m=+153.586157015" lastFinishedPulling="2025-11-22 09:17:13.654767411 +0000 UTC m=+208.590457060" observedRunningTime="2025-11-22 09:17:14.186487238 +0000 UTC m=+209.122176907" watchObservedRunningTime="2025-11-22 09:17:14.191954518 +0000 UTC m=+209.127644157"
Nov 22 09:17:14 crc kubenswrapper[4846]: I1122 09:17:14.794174 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rrd7k"
Nov 22 09:17:15 crc kubenswrapper[4846]: I1122 09:17:15.159994 4846 generic.go:334] "Generic (PLEG): container finished" podID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerID="aa04e1af7cf6179fa3a4ce5dcaeaa923541127719e196bc60a7cd78115f9b457" exitCode=0
Nov 22 09:17:15 crc kubenswrapper[4846]: I1122 09:17:15.160247 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwm66" event={"ID":"1a7ef86c-1179-422c-90b7-c2e24e5687e9","Type":"ContainerDied","Data":"aa04e1af7cf6179fa3a4ce5dcaeaa923541127719e196bc60a7cd78115f9b457"}
Nov 22 09:17:15 crc kubenswrapper[4846]: I1122 09:17:15.166857 4846 generic.go:334] "Generic (PLEG): container finished" podID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerID="f8843ae37042c98afbc71f3782171a798d9483a84e75e0812504fed73d50ffe7" exitCode=0
Nov 22 09:17:15 crc kubenswrapper[4846]: I1122 09:17:15.166941 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk4c" event={"ID":"8b48034e-c5ec-4455-8e2b-f287119ee9aa","Type":"ContainerDied","Data":"f8843ae37042c98afbc71f3782171a798d9483a84e75e0812504fed73d50ffe7"}
Nov 22 09:17:15 crc kubenswrapper[4846]: I1122 09:17:15.170436 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxjp" event={"ID":"c642ad0c-4276-4e16-af01-7cb25b6eec61","Type":"ContainerStarted","Data":"6d5975245388b44d660726017c614b92a237cac750c99aefb11fc3cfb8605100"}
Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.190786 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk4c" event={"ID":"8b48034e-c5ec-4455-8e2b-f287119ee9aa","Type":"ContainerStarted","Data":"053ce6cfcfd104f680f3c8f115140e7a88239d03df25ddf27a043f02f7dd4cbf"}
Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.194287 4846 generic.go:334] "Generic (PLEG): container finished" podID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerID="6d5975245388b44d660726017c614b92a237cac750c99aefb11fc3cfb8605100" exitCode=0
Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.194359 4846 kubelet.go:2453] "SyncLoop
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxjp" event={"ID":"c642ad0c-4276-4e16-af01-7cb25b6eec61","Type":"ContainerDied","Data":"6d5975245388b44d660726017c614b92a237cac750c99aefb11fc3cfb8605100"} Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.203650 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwm66" event={"ID":"1a7ef86c-1179-422c-90b7-c2e24e5687e9","Type":"ContainerStarted","Data":"7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725"} Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.208970 4846 generic.go:334] "Generic (PLEG): container finished" podID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerID="9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8" exitCode=0 Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.209082 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qplc2" event={"ID":"ee01bf13-f20b-4778-83bc-ccbb4fa78da4","Type":"ContainerDied","Data":"9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8"} Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.221885 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cktz" event={"ID":"45f10022-ca14-42a8-bb6f-e28b7df2da4e","Type":"ContainerStarted","Data":"73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840"} Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.224085 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rxk4c" podStartSLOduration=3.094267899 podStartE2EDuration="1m2.224067612s" podCreationTimestamp="2025-11-22 09:16:14 +0000 UTC" firstStartedPulling="2025-11-22 09:16:16.52159915 +0000 UTC m=+151.457288799" lastFinishedPulling="2025-11-22 09:17:15.651398863 +0000 UTC m=+210.587088512" observedRunningTime="2025-11-22 09:17:16.221937496 +0000 UTC m=+211.157627145" watchObservedRunningTime="2025-11-22 09:17:16.224067612 +0000 UTC m=+211.159757261" Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.299315 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zwm66" podStartSLOduration=3.257616773 podStartE2EDuration="1m2.299282269s" podCreationTimestamp="2025-11-22 09:16:14 +0000 UTC" firstStartedPulling="2025-11-22 09:16:16.522082654 +0000 UTC m=+151.457772313" lastFinishedPulling="2025-11-22 09:17:15.56374816 +0000 UTC m=+210.499437809" observedRunningTime="2025-11-22 09:17:16.295931384 +0000 UTC m=+211.231621033" watchObservedRunningTime="2025-11-22 09:17:16.299282269 +0000 UTC m=+211.234971948" Nov 22 09:17:16 crc kubenswrapper[4846]: I1122 09:17:16.315424 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9cktz" podStartSLOduration=3.845209633 podStartE2EDuration="59.315401329s" podCreationTimestamp="2025-11-22 09:16:17 +0000 UTC" firstStartedPulling="2025-11-22 09:16:19.628788627 +0000 UTC m=+154.564478276" lastFinishedPulling="2025-11-22 09:17:15.098980323 +0000 UTC m=+210.034669972" observedRunningTime="2025-11-22 09:17:16.314864183 +0000 UTC m=+211.250553832" watchObservedRunningTime="2025-11-22 09:17:16.315401329 +0000 UTC m=+211.251090978" Nov 22 09:17:17 crc kubenswrapper[4846]: I1122 09:17:17.032802 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:17:17 crc 
kubenswrapper[4846]: I1122 09:17:17.032878 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:17:17 crc kubenswrapper[4846]: I1122 09:17:17.082095 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:17:17 crc kubenswrapper[4846]: I1122 09:17:17.229859 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qplc2" event={"ID":"ee01bf13-f20b-4778-83bc-ccbb4fa78da4","Type":"ContainerStarted","Data":"2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2"} Nov 22 09:17:17 crc kubenswrapper[4846]: I1122 09:17:17.253094 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qplc2" podStartSLOduration=2.16950366 podStartE2EDuration="1m1.253073785s" podCreationTimestamp="2025-11-22 09:16:16 +0000 UTC" firstStartedPulling="2025-11-22 09:16:17.530696738 +0000 UTC m=+152.466386387" lastFinishedPulling="2025-11-22 09:17:16.614266863 +0000 UTC m=+211.549956512" observedRunningTime="2025-11-22 09:17:17.249863525 +0000 UTC m=+212.185553174" watchObservedRunningTime="2025-11-22 09:17:17.253073785 +0000 UTC m=+212.188763434" Nov 22 09:17:17 crc kubenswrapper[4846]: I1122 09:17:17.937595 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8mn4j" Nov 22 09:17:17 crc kubenswrapper[4846]: I1122 09:17:17.937651 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8mn4j" Nov 22 09:17:18 crc kubenswrapper[4846]: I1122 09:17:18.111917 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9cktz" Nov 22 09:17:18 crc kubenswrapper[4846]: I1122 09:17:18.112899 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9cktz" Nov 22 09:17:18 crc kubenswrapper[4846]: I1122 09:17:18.245999 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxjp" event={"ID":"c642ad0c-4276-4e16-af01-7cb25b6eec61","Type":"ContainerStarted","Data":"4a6fd49ba054638e1afde2921a374769bd7d81c2f8930f755532160fee682c4a"} Nov 22 09:17:18 crc kubenswrapper[4846]: I1122 09:17:18.977801 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8mn4j" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerName="registry-server" probeResult="failure" output=< Nov 22 09:17:18 crc kubenswrapper[4846]: timeout: failed to connect service ":50051" within 1s Nov 22 09:17:18 crc kubenswrapper[4846]: > Nov 22 09:17:19 crc kubenswrapper[4846]: I1122 09:17:19.163663 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9cktz" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerName="registry-server" probeResult="failure" output=< Nov 22 09:17:19 crc kubenswrapper[4846]: timeout: failed to connect service ":50051" within 1s Nov 22 09:17:19 crc kubenswrapper[4846]: > Nov 22 09:17:24 crc kubenswrapper[4846]: I1122 09:17:24.492116 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:17:24 crc kubenswrapper[4846]: I1122 09:17:24.492801 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
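Annotation: the two "Probe failed" entries above carry the probe's own output, a failed connect to :50051 within a 1s budget; that is the catalog pods' registry-server gRPC health check. A rough stand-in for the connect step of such a check, with the address and timeout taken from the log output (the real probe also issues a grpc.health.v1/Check RPC, omitted here):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Dial the registry-server port with the 1s budget seen in the probe output.
	conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", time.Second)
	if err != nil {
		fmt.Println("probe failure:", err) // what the kubelet records as probeResult="failure"
		return
	}
	conn.Close()
	fmt.Println("probe success: registry-server is accepting connections")
}
```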
pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:17:24 crc kubenswrapper[4846]: I1122 09:17:24.536769 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:17:24 crc kubenswrapper[4846]: I1122 09:17:24.555855 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7vxjp" podStartSLOduration=9.320998367 podStartE2EDuration="1m10.555830989s" podCreationTimestamp="2025-11-22 09:16:14 +0000 UTC" firstStartedPulling="2025-11-22 09:16:16.416073281 +0000 UTC m=+151.351762940" lastFinishedPulling="2025-11-22 09:17:17.650905913 +0000 UTC m=+212.586595562" observedRunningTime="2025-11-22 09:17:18.28402646 +0000 UTC m=+213.219716109" watchObservedRunningTime="2025-11-22 09:17:24.555830989 +0000 UTC m=+219.491520638" Nov 22 09:17:25 crc kubenswrapper[4846]: I1122 09:17:25.016357 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:17:25 crc kubenswrapper[4846]: I1122 09:17:25.016445 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:17:25 crc kubenswrapper[4846]: I1122 09:17:25.055880 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:17:25 crc kubenswrapper[4846]: I1122 09:17:25.099918 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:17:25 crc kubenswrapper[4846]: I1122 09:17:25.099970 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:17:25 crc kubenswrapper[4846]: I1122 09:17:25.140037 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:17:25 crc kubenswrapper[4846]: I1122 09:17:25.331137 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:17:25 crc kubenswrapper[4846]: I1122 09:17:25.333296 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:17:25 crc kubenswrapper[4846]: I1122 09:17:25.337228 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:17:26 crc kubenswrapper[4846]: I1122 09:17:26.171186 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vxjp"] Nov 22 09:17:26 crc kubenswrapper[4846]: I1122 09:17:26.637205 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:17:26 crc kubenswrapper[4846]: I1122 09:17:26.637262 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:17:26 crc kubenswrapper[4846]: I1122 09:17:26.680829 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:17:27 crc kubenswrapper[4846]: I1122 09:17:27.071479 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:17:27 crc 
kubenswrapper[4846]: I1122 09:17:27.300354 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7vxjp" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerName="registry-server" containerID="cri-o://4a6fd49ba054638e1afde2921a374769bd7d81c2f8930f755532160fee682c4a" gracePeriod=2 Nov 22 09:17:27 crc kubenswrapper[4846]: I1122 09:17:27.342438 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:17:27 crc kubenswrapper[4846]: I1122 09:17:27.573622 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rxk4c"] Nov 22 09:17:27 crc kubenswrapper[4846]: I1122 09:17:27.573995 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rxk4c" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerName="registry-server" containerID="cri-o://053ce6cfcfd104f680f3c8f115140e7a88239d03df25ddf27a043f02f7dd4cbf" gracePeriod=2 Nov 22 09:17:27 crc kubenswrapper[4846]: I1122 09:17:27.980558 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8mn4j" Nov 22 09:17:28 crc kubenswrapper[4846]: I1122 09:17:28.032434 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8mn4j" Nov 22 09:17:28 crc kubenswrapper[4846]: I1122 09:17:28.158166 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9cktz" Nov 22 09:17:28 crc kubenswrapper[4846]: I1122 09:17:28.200700 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9cktz" Nov 22 09:17:28 crc kubenswrapper[4846]: I1122 09:17:28.626316 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:17:28 crc kubenswrapper[4846]: I1122 09:17:28.626910 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:17:28 crc kubenswrapper[4846]: I1122 09:17:28.626981 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:17:28 crc kubenswrapper[4846]: I1122 09:17:28.627711 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:17:28 crc kubenswrapper[4846]: I1122 09:17:28.627829 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" 
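Annotation: the machine-config-daemon liveness failure above is an HTTP probe, GET http://127.0.0.1:8798/health, refused because the daemon was not listening; the kubelet then schedules a restart. A minimal sketch of that check, assuming the default 1s probe timeout (the timeout is not recorded in these entries):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: time.Second} // assumed 1s probe timeout
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		fmt.Println("probe failure:", err) // e.g. "connect: connection refused", as logged
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe status:", resp.Status) // any 2xx-3xx status counts as healthy
}
```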
containerID="cri-o://8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50" gracePeriod=600 Nov 22 09:17:29 crc kubenswrapper[4846]: I1122 09:17:29.313872 4846 generic.go:334] "Generic (PLEG): container finished" podID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerID="053ce6cfcfd104f680f3c8f115140e7a88239d03df25ddf27a043f02f7dd4cbf" exitCode=0 Nov 22 09:17:29 crc kubenswrapper[4846]: I1122 09:17:29.313991 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk4c" event={"ID":"8b48034e-c5ec-4455-8e2b-f287119ee9aa","Type":"ContainerDied","Data":"053ce6cfcfd104f680f3c8f115140e7a88239d03df25ddf27a043f02f7dd4cbf"} Nov 22 09:17:29 crc kubenswrapper[4846]: I1122 09:17:29.317466 4846 generic.go:334] "Generic (PLEG): container finished" podID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerID="4a6fd49ba054638e1afde2921a374769bd7d81c2f8930f755532160fee682c4a" exitCode=0 Nov 22 09:17:29 crc kubenswrapper[4846]: I1122 09:17:29.317540 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxjp" event={"ID":"c642ad0c-4276-4e16-af01-7cb25b6eec61","Type":"ContainerDied","Data":"4a6fd49ba054638e1afde2921a374769bd7d81c2f8930f755532160fee682c4a"} Nov 22 09:17:29 crc kubenswrapper[4846]: I1122 09:17:29.320173 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50" exitCode=0 Nov 22 09:17:29 crc kubenswrapper[4846]: I1122 09:17:29.320230 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50"} Nov 22 09:17:29 crc kubenswrapper[4846]: I1122 09:17:29.972564 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkgwl"] Nov 22 09:17:29 crc kubenswrapper[4846]: I1122 09:17:29.972956 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fkgwl" podUID="37a85584-057d-4aa8-a753-20600a8f4bab" containerName="registry-server" containerID="cri-o://b915165b25430ccf239c8dff041a58803a8c94719136498a5e6b78e95213f3c9" gracePeriod=2 Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.341405 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxjp" event={"ID":"c642ad0c-4276-4e16-af01-7cb25b6eec61","Type":"ContainerDied","Data":"0fa2a99b7863ff12b9886fc5202b0e8cdfafd8390ce92bf34a015b5c78ebb845"} Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.341472 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fa2a99b7863ff12b9886fc5202b0e8cdfafd8390ce92bf34a015b5c78ebb845" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.343530 4846 generic.go:334] "Generic (PLEG): container finished" podID="37a85584-057d-4aa8-a753-20600a8f4bab" containerID="b915165b25430ccf239c8dff041a58803a8c94719136498a5e6b78e95213f3c9" exitCode=0 Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.343571 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkgwl" event={"ID":"37a85584-057d-4aa8-a753-20600a8f4bab","Type":"ContainerDied","Data":"b915165b25430ccf239c8dff041a58803a8c94719136498a5e6b78e95213f3c9"} Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 
09:17:30.388223 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.452504 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.476626 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-utilities\") pod \"c642ad0c-4276-4e16-af01-7cb25b6eec61\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.476768 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d87fl\" (UniqueName: \"kubernetes.io/projected/c642ad0c-4276-4e16-af01-7cb25b6eec61-kube-api-access-d87fl\") pod \"c642ad0c-4276-4e16-af01-7cb25b6eec61\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.476828 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-catalog-content\") pod \"c642ad0c-4276-4e16-af01-7cb25b6eec61\" (UID: \"c642ad0c-4276-4e16-af01-7cb25b6eec61\") " Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.477910 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-utilities" (OuterVolumeSpecName: "utilities") pod "c642ad0c-4276-4e16-af01-7cb25b6eec61" (UID: "c642ad0c-4276-4e16-af01-7cb25b6eec61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.485567 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c642ad0c-4276-4e16-af01-7cb25b6eec61-kube-api-access-d87fl" (OuterVolumeSpecName: "kube-api-access-d87fl") pod "c642ad0c-4276-4e16-af01-7cb25b6eec61" (UID: "c642ad0c-4276-4e16-af01-7cb25b6eec61"). InnerVolumeSpecName "kube-api-access-d87fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.537408 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c642ad0c-4276-4e16-af01-7cb25b6eec61" (UID: "c642ad0c-4276-4e16-af01-7cb25b6eec61"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.578579 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-catalog-content\") pod \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.578724 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-utilities\") pod \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.578826 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2hrd\" (UniqueName: \"kubernetes.io/projected/8b48034e-c5ec-4455-8e2b-f287119ee9aa-kube-api-access-x2hrd\") pod \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\" (UID: \"8b48034e-c5ec-4455-8e2b-f287119ee9aa\") " Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.579209 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d87fl\" (UniqueName: \"kubernetes.io/projected/c642ad0c-4276-4e16-af01-7cb25b6eec61-kube-api-access-d87fl\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.579238 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.579251 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c642ad0c-4276-4e16-af01-7cb25b6eec61-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.579826 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-utilities" (OuterVolumeSpecName: "utilities") pod "8b48034e-c5ec-4455-8e2b-f287119ee9aa" (UID: "8b48034e-c5ec-4455-8e2b-f287119ee9aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.587319 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b48034e-c5ec-4455-8e2b-f287119ee9aa-kube-api-access-x2hrd" (OuterVolumeSpecName: "kube-api-access-x2hrd") pod "8b48034e-c5ec-4455-8e2b-f287119ee9aa" (UID: "8b48034e-c5ec-4455-8e2b-f287119ee9aa"). InnerVolumeSpecName "kube-api-access-x2hrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.634791 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b48034e-c5ec-4455-8e2b-f287119ee9aa" (UID: "8b48034e-c5ec-4455-8e2b-f287119ee9aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.681120 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.681585 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2hrd\" (UniqueName: \"kubernetes.io/projected/8b48034e-c5ec-4455-8e2b-f287119ee9aa-kube-api-access-x2hrd\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:30 crc kubenswrapper[4846]: I1122 09:17:30.681666 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b48034e-c5ec-4455-8e2b-f287119ee9aa-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.353208 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"e9d1e242bde74884effedf6ed226573341c9a217ba1fc454c2f0c977522434d6"} Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.357638 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vxjp" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.357700 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxk4c" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.357745 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk4c" event={"ID":"8b48034e-c5ec-4455-8e2b-f287119ee9aa","Type":"ContainerDied","Data":"c82803f03abe3b3f19946548201b45a5e0ce12cd87418130f3cecf6c938d091b"} Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.357787 4846 scope.go:117] "RemoveContainer" containerID="053ce6cfcfd104f680f3c8f115140e7a88239d03df25ddf27a043f02f7dd4cbf" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.380365 4846 scope.go:117] "RemoveContainer" containerID="f8843ae37042c98afbc71f3782171a798d9483a84e75e0812504fed73d50ffe7" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.410269 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rxk4c"] Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.422304 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rxk4c"] Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.427686 4846 scope.go:117] "RemoveContainer" containerID="b84c1ea5a52b237ad064e938798f2be8ec8f62f481cd8fa62917683a39569c4e" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.432462 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vxjp"] Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.452369 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7vxjp"] Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.527975 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.698810 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf5js\" (UniqueName: \"kubernetes.io/projected/37a85584-057d-4aa8-a753-20600a8f4bab-kube-api-access-mf5js\") pod \"37a85584-057d-4aa8-a753-20600a8f4bab\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.698982 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-utilities\") pod \"37a85584-057d-4aa8-a753-20600a8f4bab\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.699015 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-catalog-content\") pod \"37a85584-057d-4aa8-a753-20600a8f4bab\" (UID: \"37a85584-057d-4aa8-a753-20600a8f4bab\") " Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.704888 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-utilities" (OuterVolumeSpecName: "utilities") pod "37a85584-057d-4aa8-a753-20600a8f4bab" (UID: "37a85584-057d-4aa8-a753-20600a8f4bab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.710724 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a85584-057d-4aa8-a753-20600a8f4bab-kube-api-access-mf5js" (OuterVolumeSpecName: "kube-api-access-mf5js") pod "37a85584-057d-4aa8-a753-20600a8f4bab" (UID: "37a85584-057d-4aa8-a753-20600a8f4bab"). InnerVolumeSpecName "kube-api-access-mf5js". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.723105 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37a85584-057d-4aa8-a753-20600a8f4bab" (UID: "37a85584-057d-4aa8-a753-20600a8f4bab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.800805 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.800853 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a85584-057d-4aa8-a753-20600a8f4bab-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:31 crc kubenswrapper[4846]: I1122 09:17:31.800865 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf5js\" (UniqueName: \"kubernetes.io/projected/37a85584-057d-4aa8-a753-20600a8f4bab-kube-api-access-mf5js\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.045888 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" path="/var/lib/kubelet/pods/8b48034e-c5ec-4455-8e2b-f287119ee9aa/volumes" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.046789 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61" path="/var/lib/kubelet/pods/c642ad0c-4276-4e16-af01-7cb25b6eec61/volumes" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.368102 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkgwl" event={"ID":"37a85584-057d-4aa8-a753-20600a8f4bab","Type":"ContainerDied","Data":"eac658e78d7c874db0c7741ecbeb6108cee98e763dcfbb57e30a0ee567dacb99"} Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.368520 4846 scope.go:117] "RemoveContainer" containerID="b915165b25430ccf239c8dff041a58803a8c94719136498a5e6b78e95213f3c9" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.368536 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fkgwl" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.374774 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9cktz"] Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.375277 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9cktz" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerName="registry-server" containerID="cri-o://73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840" gracePeriod=2 Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.400649 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkgwl"] Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.401240 4846 scope.go:117] "RemoveContainer" containerID="f7fc8e50b762894e80ff0d3085533748af9680f24c9269777b722ae928c31d18" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.404453 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkgwl"] Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.415387 4846 scope.go:117] "RemoveContainer" containerID="a0c747a19d6c982bfb57a079491a8ce3ddb5d85db641d23ae09026e08f1b0ede" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.717255 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9cktz" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.816374 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-catalog-content\") pod \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.816473 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg9dx\" (UniqueName: \"kubernetes.io/projected/45f10022-ca14-42a8-bb6f-e28b7df2da4e-kube-api-access-wg9dx\") pod \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.816533 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-utilities\") pod \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\" (UID: \"45f10022-ca14-42a8-bb6f-e28b7df2da4e\") " Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.817461 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-utilities" (OuterVolumeSpecName: "utilities") pod "45f10022-ca14-42a8-bb6f-e28b7df2da4e" (UID: "45f10022-ca14-42a8-bb6f-e28b7df2da4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.825505 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f10022-ca14-42a8-bb6f-e28b7df2da4e-kube-api-access-wg9dx" (OuterVolumeSpecName: "kube-api-access-wg9dx") pod "45f10022-ca14-42a8-bb6f-e28b7df2da4e" (UID: "45f10022-ca14-42a8-bb6f-e28b7df2da4e"). InnerVolumeSpecName "kube-api-access-wg9dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.913799 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45f10022-ca14-42a8-bb6f-e28b7df2da4e" (UID: "45f10022-ca14-42a8-bb6f-e28b7df2da4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.918766 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.918795 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg9dx\" (UniqueName: \"kubernetes.io/projected/45f10022-ca14-42a8-bb6f-e28b7df2da4e-kube-api-access-wg9dx\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:32 crc kubenswrapper[4846]: I1122 09:17:32.918806 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45f10022-ca14-42a8-bb6f-e28b7df2da4e-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.378917 4846 generic.go:334] "Generic (PLEG): container finished" podID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerID="73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840" exitCode=0 Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.378983 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9cktz" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.379017 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cktz" event={"ID":"45f10022-ca14-42a8-bb6f-e28b7df2da4e","Type":"ContainerDied","Data":"73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840"} Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.380138 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9cktz" event={"ID":"45f10022-ca14-42a8-bb6f-e28b7df2da4e","Type":"ContainerDied","Data":"9f9ca63eb3a9b34daffe23327dc18c4175e2837d5c9d6d8827fcaf69678ebdfb"} Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.380229 4846 scope.go:117] "RemoveContainer" containerID="73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.400232 4846 scope.go:117] "RemoveContainer" containerID="d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.422943 4846 scope.go:117] "RemoveContainer" containerID="d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.427279 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9cktz"] Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.447137 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9cktz"] Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.458283 4846 scope.go:117] "RemoveContainer" containerID="73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840" Nov 22 09:17:33 crc kubenswrapper[4846]: E1122 09:17:33.458824 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840\": container with ID starting with 73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840 not found: ID does not exist" containerID="73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.458871 4846 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840"} err="failed to get container status \"73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840\": rpc error: code = NotFound desc = could not find container \"73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840\": container with ID starting with 73e45ec9d9278124bff68c262646c2c63fa5e3c21abe8a7ec9eae5bd7b02f840 not found: ID does not exist" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.458907 4846 scope.go:117] "RemoveContainer" containerID="d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba" Nov 22 09:17:33 crc kubenswrapper[4846]: E1122 09:17:33.459385 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba\": container with ID starting with d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba not found: ID does not exist" containerID="d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.459439 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba"} err="failed to get container status \"d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba\": rpc error: code = NotFound desc = could not find container \"d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba\": container with ID starting with d56492636c57c8c370b330e1bd176e9228ca332a7d97417009852c4f9eca10ba not found: ID does not exist" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.459476 4846 scope.go:117] "RemoveContainer" containerID="d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497" Nov 22 09:17:33 crc kubenswrapper[4846]: E1122 09:17:33.459816 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497\": container with ID starting with d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497 not found: ID does not exist" containerID="d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497" Nov 22 09:17:33 crc kubenswrapper[4846]: I1122 09:17:33.459851 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497"} err="failed to get container status \"d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497\": rpc error: code = NotFound desc = could not find container \"d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497\": container with ID starting with d2eae0da3dd50567c0281312c0ba201f3cf471aa61de1272a138e6e4af3be497 not found: ID does not exist" Nov 22 09:17:34 crc kubenswrapper[4846]: I1122 09:17:34.042241 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a85584-057d-4aa8-a753-20600a8f4bab" path="/var/lib/kubelet/pods/37a85584-057d-4aa8-a753-20600a8f4bab/volumes" Nov 22 09:17:34 crc kubenswrapper[4846]: I1122 09:17:34.042917 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" path="/var/lib/kubelet/pods/45f10022-ca14-42a8-bb6f-e28b7df2da4e/volumes" Nov 22 09:17:36 crc kubenswrapper[4846]: I1122 
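Annotation: the three "ContainerStatus from runtime service failed ... NotFound" errors above are benign races: RemoveContainer ran after CRI-O had already discarded the container, so the status lookup fails and the deletion is treated as done. A sketch of that tolerance pattern, assuming the google.golang.org/grpc status package (CRI is a gRPC API); this is the general idiom, not kubelet's actual code:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// ignoreNotFound treats a NotFound from the runtime as "already removed",
// which is why the errors above are logged once and then dropped, not retried.
func ignoreNotFound(err error) error {
	if status.Code(err) == codes.NotFound {
		return nil
	}
	return err
}

func main() {
	err := status.Error(codes.NotFound, "could not find container \"73e4...\"")
	fmt.Println(ignoreNotFound(err)) // <nil>: the container is gone either way
}
```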
09:17:36.746957 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2fp62"] Nov 22 09:18:01 crc kubenswrapper[4846]: I1122 09:18:01.786077 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" podUID="4b0be692-d108-4051-9a33-6529b4ed1e7b" containerName="oauth-openshift" containerID="cri-o://3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5" gracePeriod=15 Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.148946 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189036 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r"] Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189331 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf" containerName="pruner" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189348 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf" containerName="pruner" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189360 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerName="extract-content" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189368 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerName="extract-content" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189388 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerName="extract-utilities" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189395 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerName="extract-utilities" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189402 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a85584-057d-4aa8-a753-20600a8f4bab" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189407 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a85584-057d-4aa8-a753-20600a8f4bab" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189414 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b0be692-d108-4051-9a33-6529b4ed1e7b" containerName="oauth-openshift" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189443 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0be692-d108-4051-9a33-6529b4ed1e7b" containerName="oauth-openshift" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189452 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a85584-057d-4aa8-a753-20600a8f4bab" containerName="extract-content" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189458 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a85584-057d-4aa8-a753-20600a8f4bab" containerName="extract-content" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189467 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerName="extract-utilities" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189474 4846 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerName="extract-utilities" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189484 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189490 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189503 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a85584-057d-4aa8-a753-20600a8f4bab" containerName="extract-utilities" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189510 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a85584-057d-4aa8-a753-20600a8f4bab" containerName="extract-utilities" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189522 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerName="extract-utilities" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189528 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerName="extract-utilities" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189535 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189543 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189554 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189560 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189568 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerName="extract-content" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189573 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerName="extract-content" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.189582 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerName="extract-content" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189589 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerName="extract-content" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189718 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c642ad0c-4276-4e16-af01-7cb25b6eec61" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189732 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b0be692-d108-4051-9a33-6529b4ed1e7b" containerName="oauth-openshift" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189744 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eaccbbd-f2ed-41f4-9e86-a2a68f7c53bf" containerName="pruner" Nov 22 09:18:02 crc kubenswrapper[4846]: 
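Annotation: the long run of cpu_manager RemoveStaleState / state_mem "Deleted CPUSet assignment" pairs above fires while admitting the new oauth-openshift pod: the resource managers sweep out per-container state left behind by the marketplace pods deleted earlier. A toy model of that sweep, keyed by (podUID, containerName) as in the entries; not the kubelet's data structures:

```go
package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops per-container resource-manager state whose pod is
// no longer active, mirroring the cpu_manager/memory_manager pairs above.
func removeStaleState(state map[key]string, activePods map[string]bool) {
	for k := range state {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.container, k.podUID)
			delete(state, k) // the "Deleted CPUSet assignment" step
		}
	}
}

func main() {
	state := map[key]string{
		{podUID: "37a85584-057d-4aa8-a753-20600a8f4bab", container: "registry-server"}: "cpus 0-3",
	}
	removeStaleState(state, map[string]bool{}) // none of the marketplace pods remain active
}
```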
I1122 09:18:02.189751 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b48034e-c5ec-4455-8e2b-f287119ee9aa" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189759 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a85584-057d-4aa8-a753-20600a8f4bab" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.189765 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f10022-ca14-42a8-bb6f-e28b7df2da4e" containerName="registry-server" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.190220 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.199924 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r"] Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255087 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-session\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255153 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-provider-selection\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255192 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-service-ca\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255244 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-serving-cert\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255277 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-cliconfig\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255316 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-router-certs\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255342 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-policies\") pod 
\"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255364 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-error\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255411 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlfvd\" (UniqueName: \"kubernetes.io/projected/4b0be692-d108-4051-9a33-6529b4ed1e7b-kube-api-access-wlfvd\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255905 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-trusted-ca-bundle\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.255967 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-login\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256013 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-ocp-branding-template\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256074 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-idp-0-file-data\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256099 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-dir\") pod \"4b0be692-d108-4051-9a33-6529b4ed1e7b\" (UID: \"4b0be692-d108-4051-9a33-6529b4ed1e7b\") " Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256220 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256264 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-session\") pod 
\"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256293 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256287 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256317 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256346 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256360 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256375 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256399 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256298 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256458 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256493 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-audit-policies\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256528 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz7nh\" (UniqueName: \"kubernetes.io/projected/eac3abdb-423c-4067-8f62-d45f8568223d-kube-api-access-fz7nh\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256534 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256630 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eac3abdb-423c-4067-8f62-d45f8568223d-audit-dir\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256662 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-template-login\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256700 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256724 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-template-error\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256775 4846 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256793 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256806 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256819 4846 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.256890 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.265333 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.265928 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.266237 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b0be692-d108-4051-9a33-6529b4ed1e7b-kube-api-access-wlfvd" (OuterVolumeSpecName: "kube-api-access-wlfvd") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "kube-api-access-wlfvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.266244 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.266608 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.266615 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.266848 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.267079 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.272652 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4b0be692-d108-4051-9a33-6529b4ed1e7b" (UID: "4b0be692-d108-4051-9a33-6529b4ed1e7b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357712 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357791 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-session\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357814 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357830 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357861 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357898 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357919 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357943 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357966 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-audit-policies\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.357995 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz7nh\" (UniqueName: \"kubernetes.io/projected/eac3abdb-423c-4067-8f62-d45f8568223d-kube-api-access-fz7nh\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358016 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-template-login\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358036 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eac3abdb-423c-4067-8f62-d45f8568223d-audit-dir\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358080 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358100 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-template-error\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358151 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358163 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358173 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358183 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358193 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358203 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358215 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358224 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358234 4846 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b0be692-d108-4051-9a33-6529b4ed1e7b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.358245 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlfvd\" (UniqueName: \"kubernetes.io/projected/4b0be692-d108-4051-9a33-6529b4ed1e7b-kube-api-access-wlfvd\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.359091 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eac3abdb-423c-4067-8f62-d45f8568223d-audit-dir\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" 
(UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.359318 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.359328 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.359342 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-audit-policies\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.361170 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.361299 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-template-error\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.361719 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.361858 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.361904 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " 
pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.362609 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-session\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.363485 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-template-login\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.363671 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.364661 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eac3abdb-423c-4067-8f62-d45f8568223d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.373924 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz7nh\" (UniqueName: \"kubernetes.io/projected/eac3abdb-423c-4067-8f62-d45f8568223d-kube-api-access-fz7nh\") pod \"oauth-openshift-6bbf4c9fdf-w2h4r\" (UID: \"eac3abdb-423c-4067-8f62-d45f8568223d\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.520927 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.555277 4846 generic.go:334] "Generic (PLEG): container finished" podID="4b0be692-d108-4051-9a33-6529b4ed1e7b" containerID="3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5" exitCode=0 Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.555347 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" event={"ID":"4b0be692-d108-4051-9a33-6529b4ed1e7b","Type":"ContainerDied","Data":"3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5"} Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.555386 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" event={"ID":"4b0be692-d108-4051-9a33-6529b4ed1e7b","Type":"ContainerDied","Data":"a6075d82d81e46cc7556a7aff4878293f5685202e5175f933d74257a98507ffe"} Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.555409 4846 scope.go:117] "RemoveContainer" containerID="3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.555349 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2fp62" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.577468 4846 scope.go:117] "RemoveContainer" containerID="3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5" Nov 22 09:18:02 crc kubenswrapper[4846]: E1122 09:18:02.581754 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5\": container with ID starting with 3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5 not found: ID does not exist" containerID="3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.581813 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5"} err="failed to get container status \"3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5\": rpc error: code = NotFound desc = could not find container \"3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5\": container with ID starting with 3fec90deb4b5dee8bde5f56efae9a78364a2397d40b94e0764cd72a6b689c8f5 not found: ID does not exist" Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.591628 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2fp62"] Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.601579 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2fp62"] Nov 22 09:18:02 crc kubenswrapper[4846]: I1122 09:18:02.988285 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r"] Nov 22 09:18:03 crc kubenswrapper[4846]: I1122 09:18:03.563725 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" event={"ID":"eac3abdb-423c-4067-8f62-d45f8568223d","Type":"ContainerStarted","Data":"81ea45426de017740987cab3125f3ff219d16a17b1272eb19819135fb9f389ee"} Nov 22 09:18:03 crc kubenswrapper[4846]: 
I1122 09:18:03.563784 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" event={"ID":"eac3abdb-423c-4067-8f62-d45f8568223d","Type":"ContainerStarted","Data":"5907af8e8b8aabcf378fc5f1f25c241393abbe651c0a3fb22e115b7d3bff995d"} Nov 22 09:18:03 crc kubenswrapper[4846]: I1122 09:18:03.563930 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:03 crc kubenswrapper[4846]: I1122 09:18:03.587102 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" podStartSLOduration=27.587073446 podStartE2EDuration="27.587073446s" podCreationTimestamp="2025-11-22 09:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:18:03.584068262 +0000 UTC m=+258.519757921" watchObservedRunningTime="2025-11-22 09:18:03.587073446 +0000 UTC m=+258.522763095" Nov 22 09:18:03 crc kubenswrapper[4846]: I1122 09:18:03.606056 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-w2h4r" Nov 22 09:18:04 crc kubenswrapper[4846]: I1122 09:18:04.042255 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b0be692-d108-4051-9a33-6529b4ed1e7b" path="/var/lib/kubelet/pods/4b0be692-d108-4051-9a33-6529b4ed1e7b/volumes" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.471883 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rrd7k"] Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.475283 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rrd7k" podUID="2faa496d-af10-40eb-984b-1a67af462dbf" containerName="registry-server" containerID="cri-o://498b8d378e718bb4066a67aa10277ce33ab4ac612dec67f38b1f7208a5cee211" gracePeriod=30 Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.488931 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwm66"] Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.489314 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zwm66" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerName="registry-server" containerID="cri-o://7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725" gracePeriod=30 Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.502815 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hmwp7"] Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.503109 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" podUID="750ea675-e79a-459b-8261-e15dd252a8f1" containerName="marketplace-operator" containerID="cri-o://81c67e3f1d03d80f0b87aa0de08e7a8e1c81ba755756e0a6bfc0276753564f4c" gracePeriod=30 Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.509752 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qplc2"] Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.510029 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qplc2" 
podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerName="registry-server" containerID="cri-o://2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2" gracePeriod=30 Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.519712 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2lfzs"] Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.520524 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.525815 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8mn4j"] Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.526107 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8mn4j" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerName="registry-server" containerID="cri-o://1111e8627056968b78bdb076952ee80fe1503376ed228b810aa9143bef239697" gracePeriod=30 Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.548879 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2lfzs"] Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.652124 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ls22\" (UniqueName: \"kubernetes.io/projected/b2d91bbe-e29e-4a12-a7a8-92c26c4a977b-kube-api-access-6ls22\") pod \"marketplace-operator-79b997595-2lfzs\" (UID: \"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.652384 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b2d91bbe-e29e-4a12-a7a8-92c26c4a977b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2lfzs\" (UID: \"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.652522 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d91bbe-e29e-4a12-a7a8-92c26c4a977b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2lfzs\" (UID: \"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.688549 4846 generic.go:334] "Generic (PLEG): container finished" podID="2faa496d-af10-40eb-984b-1a67af462dbf" containerID="498b8d378e718bb4066a67aa10277ce33ab4ac612dec67f38b1f7208a5cee211" exitCode=0 Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.688637 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrd7k" event={"ID":"2faa496d-af10-40eb-984b-1a67af462dbf","Type":"ContainerDied","Data":"498b8d378e718bb4066a67aa10277ce33ab4ac612dec67f38b1f7208a5cee211"} Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.691792 4846 generic.go:334] "Generic (PLEG): container finished" podID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerID="7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725" exitCode=0 Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.691849 4846 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwm66" event={"ID":"1a7ef86c-1179-422c-90b7-c2e24e5687e9","Type":"ContainerDied","Data":"7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725"} Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.753660 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b2d91bbe-e29e-4a12-a7a8-92c26c4a977b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2lfzs\" (UID: \"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.753744 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d91bbe-e29e-4a12-a7a8-92c26c4a977b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2lfzs\" (UID: \"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.753779 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ls22\" (UniqueName: \"kubernetes.io/projected/b2d91bbe-e29e-4a12-a7a8-92c26c4a977b-kube-api-access-6ls22\") pod \"marketplace-operator-79b997595-2lfzs\" (UID: \"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.755537 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2d91bbe-e29e-4a12-a7a8-92c26c4a977b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2lfzs\" (UID: \"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.759546 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b2d91bbe-e29e-4a12-a7a8-92c26c4a977b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2lfzs\" (UID: \"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:23 crc kubenswrapper[4846]: I1122 09:18:23.770039 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ls22\" (UniqueName: \"kubernetes.io/projected/b2d91bbe-e29e-4a12-a7a8-92c26c4a977b-kube-api-access-6ls22\") pod \"marketplace-operator-79b997595-2lfzs\" (UID: \"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.025868 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.065261 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.159590 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-utilities\") pod \"2faa496d-af10-40eb-984b-1a67af462dbf\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.160315 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mx6c\" (UniqueName: \"kubernetes.io/projected/2faa496d-af10-40eb-984b-1a67af462dbf-kube-api-access-6mx6c\") pod \"2faa496d-af10-40eb-984b-1a67af462dbf\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.160420 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-catalog-content\") pod \"2faa496d-af10-40eb-984b-1a67af462dbf\" (UID: \"2faa496d-af10-40eb-984b-1a67af462dbf\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.161352 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-utilities" (OuterVolumeSpecName: "utilities") pod "2faa496d-af10-40eb-984b-1a67af462dbf" (UID: "2faa496d-af10-40eb-984b-1a67af462dbf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.161933 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.169415 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faa496d-af10-40eb-984b-1a67af462dbf-kube-api-access-6mx6c" (OuterVolumeSpecName: "kube-api-access-6mx6c") pod "2faa496d-af10-40eb-984b-1a67af462dbf" (UID: "2faa496d-af10-40eb-984b-1a67af462dbf"). InnerVolumeSpecName "kube-api-access-6mx6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.217950 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2faa496d-af10-40eb-984b-1a67af462dbf" (UID: "2faa496d-af10-40eb-984b-1a67af462dbf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.263463 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mx6c\" (UniqueName: \"kubernetes.io/projected/2faa496d-af10-40eb-984b-1a67af462dbf-kube-api-access-6mx6c\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.263518 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2faa496d-af10-40eb-984b-1a67af462dbf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.445433 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2lfzs"] Nov 22 09:18:24 crc kubenswrapper[4846]: W1122 09:18:24.461370 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2d91bbe_e29e_4a12_a7a8_92c26c4a977b.slice/crio-15c110b62a670028755285c3ae47e1e5242df5019aacb07974ebbe10cd5bb225 WatchSource:0}: Error finding container 15c110b62a670028755285c3ae47e1e5242df5019aacb07974ebbe10cd5bb225: Status 404 returned error can't find the container with id 15c110b62a670028755285c3ae47e1e5242df5019aacb07974ebbe10cd5bb225 Nov 22 09:18:24 crc kubenswrapper[4846]: E1122 09:18:24.492270 4846 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725 is running failed: container process not found" containerID="7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 09:18:24 crc kubenswrapper[4846]: E1122 09:18:24.492577 4846 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725 is running failed: container process not found" containerID="7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 09:18:24 crc kubenswrapper[4846]: E1122 09:18:24.493593 4846 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725 is running failed: container process not found" containerID="7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 09:18:24 crc kubenswrapper[4846]: E1122 09:18:24.493635 4846 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-zwm66" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerName="registry-server" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.703233 4846 generic.go:334] "Generic (PLEG): container finished" podID="750ea675-e79a-459b-8261-e15dd252a8f1" containerID="81c67e3f1d03d80f0b87aa0de08e7a8e1c81ba755756e0a6bfc0276753564f4c" exitCode=0 Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.703296 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" event={"ID":"750ea675-e79a-459b-8261-e15dd252a8f1","Type":"ContainerDied","Data":"81c67e3f1d03d80f0b87aa0de08e7a8e1c81ba755756e0a6bfc0276753564f4c"} Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.714292 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.732608 4846 generic.go:334] "Generic (PLEG): container finished" podID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerID="2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2" exitCode=0 Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.732699 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qplc2" event={"ID":"ee01bf13-f20b-4778-83bc-ccbb4fa78da4","Type":"ContainerDied","Data":"2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2"} Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.732774 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qplc2" event={"ID":"ee01bf13-f20b-4778-83bc-ccbb4fa78da4","Type":"ContainerDied","Data":"555a7776bfb9c5e7a1872cd596d31f311f06b2b289d0f04b1bacef1ed233b0a1"} Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.732799 4846 scope.go:117] "RemoveContainer" containerID="2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.741594 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rrd7k" event={"ID":"2faa496d-af10-40eb-984b-1a67af462dbf","Type":"ContainerDied","Data":"c22da52be887008f7d93d1dcc8ff897300dc733f24d02dbfd9245c002c7ef774"} Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.741742 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rrd7k" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.763119 4846 generic.go:334] "Generic (PLEG): container finished" podID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerID="1111e8627056968b78bdb076952ee80fe1503376ed228b810aa9143bef239697" exitCode=0 Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.763414 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mn4j" event={"ID":"c0da74d6-4145-4e5d-ac01-a5df2289d427","Type":"ContainerDied","Data":"1111e8627056968b78bdb076952ee80fe1503376ed228b810aa9143bef239697"} Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.765866 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" event={"ID":"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b","Type":"ContainerStarted","Data":"15c110b62a670028755285c3ae47e1e5242df5019aacb07974ebbe10cd5bb225"} Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.812573 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rrd7k"] Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.814491 4846 scope.go:117] "RemoveContainer" containerID="9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.816714 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rrd7k"] Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.864163 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.864684 4846 scope.go:117] "RemoveContainer" containerID="040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.872947 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-catalog-content\") pod \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.873004 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-utilities\") pod \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.873089 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-kube-api-access-t2llq\") pod \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\" (UID: \"ee01bf13-f20b-4778-83bc-ccbb4fa78da4\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.875646 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.875917 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-utilities" (OuterVolumeSpecName: "utilities") pod "ee01bf13-f20b-4778-83bc-ccbb4fa78da4" (UID: "ee01bf13-f20b-4778-83bc-ccbb4fa78da4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.882392 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-kube-api-access-t2llq" (OuterVolumeSpecName: "kube-api-access-t2llq") pod "ee01bf13-f20b-4778-83bc-ccbb4fa78da4" (UID: "ee01bf13-f20b-4778-83bc-ccbb4fa78da4"). InnerVolumeSpecName "kube-api-access-t2llq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.890223 4846 scope.go:117] "RemoveContainer" containerID="2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2" Nov 22 09:18:24 crc kubenswrapper[4846]: E1122 09:18:24.894325 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2\": container with ID starting with 2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2 not found: ID does not exist" containerID="2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.894558 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2"} err="failed to get container status \"2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2\": rpc error: code = NotFound desc = could not find container \"2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2\": container with ID starting with 2ef199142f7e5b6fbd6fd8c20e9a71b336e1137bbb55aba8fd590cb0874aeeb2 not found: ID does not exist" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.894711 4846 scope.go:117] "RemoveContainer" containerID="9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8" Nov 22 09:18:24 crc kubenswrapper[4846]: E1122 09:18:24.897790 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8\": container with ID starting with 9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8 not found: ID does not exist" containerID="9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.897854 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8"} err="failed to get container status \"9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8\": rpc error: code = NotFound desc = could not find container \"9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8\": container with ID starting with 9b9e86fac0315bed3ac229aa789b984d03c52a8e320dd63f8075e9d244ae71c8 not found: ID does not exist" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.897899 4846 scope.go:117] "RemoveContainer" containerID="040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af" Nov 22 09:18:24 crc kubenswrapper[4846]: E1122 09:18:24.899277 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af\": container with ID starting with 040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af not found: ID does not 
exist" containerID="040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.899435 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af"} err="failed to get container status \"040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af\": rpc error: code = NotFound desc = could not find container \"040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af\": container with ID starting with 040a4629be760ca6cd0549f2147e85cba92fdbf631a5535d925dc8c057a4b0af not found: ID does not exist" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.899589 4846 scope.go:117] "RemoveContainer" containerID="498b8d378e718bb4066a67aa10277ce33ab4ac612dec67f38b1f7208a5cee211" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.901177 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8mn4j" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.902020 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee01bf13-f20b-4778-83bc-ccbb4fa78da4" (UID: "ee01bf13-f20b-4778-83bc-ccbb4fa78da4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.925251 4846 scope.go:117] "RemoveContainer" containerID="4fe418646a389460a345c63661a36b27f976c7fedff34b0cb19bb9ae9e725c64" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.953361 4846 scope.go:117] "RemoveContainer" containerID="b1ef4cc77ac4663d8c91a9b5ac841855bccb4a5051969a34c80ea88dc1ef5f9c" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.975549 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-trusted-ca\") pod \"750ea675-e79a-459b-8261-e15dd252a8f1\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.975608 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4pwt\" (UniqueName: \"kubernetes.io/projected/750ea675-e79a-459b-8261-e15dd252a8f1-kube-api-access-s4pwt\") pod \"750ea675-e79a-459b-8261-e15dd252a8f1\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.975663 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-operator-metrics\") pod \"750ea675-e79a-459b-8261-e15dd252a8f1\" (UID: \"750ea675-e79a-459b-8261-e15dd252a8f1\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.975712 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-catalog-content\") pod \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.975744 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-utilities\") pod \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.975784 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbptv\" (UniqueName: \"kubernetes.io/projected/1a7ef86c-1179-422c-90b7-c2e24e5687e9-kube-api-access-nbptv\") pod \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\" (UID: \"1a7ef86c-1179-422c-90b7-c2e24e5687e9\") " Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.976004 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.976015 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.976026 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/ee01bf13-f20b-4778-83bc-ccbb4fa78da4-kube-api-access-t2llq\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.979571 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a7ef86c-1179-422c-90b7-c2e24e5687e9-kube-api-access-nbptv" (OuterVolumeSpecName: "kube-api-access-nbptv") pod "1a7ef86c-1179-422c-90b7-c2e24e5687e9" (UID: "1a7ef86c-1179-422c-90b7-c2e24e5687e9"). InnerVolumeSpecName "kube-api-access-nbptv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.979582 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "750ea675-e79a-459b-8261-e15dd252a8f1" (UID: "750ea675-e79a-459b-8261-e15dd252a8f1"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.979861 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "750ea675-e79a-459b-8261-e15dd252a8f1" (UID: "750ea675-e79a-459b-8261-e15dd252a8f1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.979947 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-utilities" (OuterVolumeSpecName: "utilities") pod "1a7ef86c-1179-422c-90b7-c2e24e5687e9" (UID: "1a7ef86c-1179-422c-90b7-c2e24e5687e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:24 crc kubenswrapper[4846]: I1122 09:18:24.983081 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750ea675-e79a-459b-8261-e15dd252a8f1-kube-api-access-s4pwt" (OuterVolumeSpecName: "kube-api-access-s4pwt") pod "750ea675-e79a-459b-8261-e15dd252a8f1" (UID: "750ea675-e79a-459b-8261-e15dd252a8f1"). 
InnerVolumeSpecName "kube-api-access-s4pwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.050813 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a7ef86c-1179-422c-90b7-c2e24e5687e9" (UID: "1a7ef86c-1179-422c-90b7-c2e24e5687e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.076986 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-utilities\") pod \"c0da74d6-4145-4e5d-ac01-a5df2289d427\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.077440 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lg8g\" (UniqueName: \"kubernetes.io/projected/c0da74d6-4145-4e5d-ac01-a5df2289d427-kube-api-access-2lg8g\") pod \"c0da74d6-4145-4e5d-ac01-a5df2289d427\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.077606 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-catalog-content\") pod \"c0da74d6-4145-4e5d-ac01-a5df2289d427\" (UID: \"c0da74d6-4145-4e5d-ac01-a5df2289d427\") " Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.077977 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.078096 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef86c-1179-422c-90b7-c2e24e5687e9-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.078204 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbptv\" (UniqueName: \"kubernetes.io/projected/1a7ef86c-1179-422c-90b7-c2e24e5687e9-kube-api-access-nbptv\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.077968 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-utilities" (OuterVolumeSpecName: "utilities") pod "c0da74d6-4145-4e5d-ac01-a5df2289d427" (UID: "c0da74d6-4145-4e5d-ac01-a5df2289d427"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.078294 4846 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.078395 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4pwt\" (UniqueName: \"kubernetes.io/projected/750ea675-e79a-459b-8261-e15dd252a8f1-kube-api-access-s4pwt\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.078419 4846 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/750ea675-e79a-459b-8261-e15dd252a8f1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.079939 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0da74d6-4145-4e5d-ac01-a5df2289d427-kube-api-access-2lg8g" (OuterVolumeSpecName: "kube-api-access-2lg8g") pod "c0da74d6-4145-4e5d-ac01-a5df2289d427" (UID: "c0da74d6-4145-4e5d-ac01-a5df2289d427"). InnerVolumeSpecName "kube-api-access-2lg8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.169648 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0da74d6-4145-4e5d-ac01-a5df2289d427" (UID: "c0da74d6-4145-4e5d-ac01-a5df2289d427"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.179508 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.179545 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0da74d6-4145-4e5d-ac01-a5df2289d427-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.179557 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lg8g\" (UniqueName: \"kubernetes.io/projected/c0da74d6-4145-4e5d-ac01-a5df2289d427-kube-api-access-2lg8g\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.688069 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8xw7p"] Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.689519 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.689612 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.689703 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerName="extract-utilities" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.689778 4846 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerName="extract-utilities" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.689869 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerName="extract-utilities" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.689956 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerName="extract-utilities" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.690038 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.690136 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.690215 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerName="extract-utilities" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.690285 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerName="extract-utilities" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.690370 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faa496d-af10-40eb-984b-1a67af462dbf" containerName="extract-utilities" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.690458 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faa496d-af10-40eb-984b-1a67af462dbf" containerName="extract-utilities" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.690540 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerName="extract-content" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.690614 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerName="extract-content" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.690697 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faa496d-af10-40eb-984b-1a67af462dbf" containerName="extract-content" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.690779 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faa496d-af10-40eb-984b-1a67af462dbf" containerName="extract-content" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.690863 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faa496d-af10-40eb-984b-1a67af462dbf" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.690939 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faa496d-af10-40eb-984b-1a67af462dbf" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.691033 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerName="extract-content" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.691145 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerName="extract-content" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.691224 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerName="extract-content" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.691301 4846 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerName="extract-content" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.691377 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.691446 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: E1122 09:18:25.691525 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750ea675-e79a-459b-8261-e15dd252a8f1" containerName="marketplace-operator" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.691614 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="750ea675-e79a-459b-8261-e15dd252a8f1" containerName="marketplace-operator" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.691804 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.691895 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="750ea675-e79a-459b-8261-e15dd252a8f1" containerName="marketplace-operator" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.691973 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="2faa496d-af10-40eb-984b-1a67af462dbf" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.692067 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.692147 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" containerName="registry-server" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.693250 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.698217 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.701876 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8xw7p"] Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.772353 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" event={"ID":"b2d91bbe-e29e-4a12-a7a8-92c26c4a977b","Type":"ContainerStarted","Data":"cdbe51f10fa3abc741cb95be1d56740755f662d479f2b3f1605698a506637376"} Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.772804 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.774265 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.774381 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hmwp7" event={"ID":"750ea675-e79a-459b-8261-e15dd252a8f1","Type":"ContainerDied","Data":"453b4a338b3b8d4c3e80a93e8c94ca0d6fb5fd67938cf0728c192a1de760639d"} Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.774502 4846 scope.go:117] "RemoveContainer" containerID="81c67e3f1d03d80f0b87aa0de08e7a8e1c81ba755756e0a6bfc0276753564f4c" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.777080 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwm66" event={"ID":"1a7ef86c-1179-422c-90b7-c2e24e5687e9","Type":"ContainerDied","Data":"0f1ebbd62ba8395b839d2aac068e2733e6520cf00c554fd2eb3238d1c19c817d"} Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.777109 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwm66" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.779319 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qplc2" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.780300 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.784817 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8mn4j" event={"ID":"c0da74d6-4145-4e5d-ac01-a5df2289d427","Type":"ContainerDied","Data":"7a96238c807168d88b8ef041ef7b4f3b9c7278e2d35668064cffea1340d59a3e"} Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.784917 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8mn4j" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.787536 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9edf4077-fb2b-42e9-8fb4-089d14519da9-catalog-content\") pod \"certified-operators-8xw7p\" (UID: \"9edf4077-fb2b-42e9-8fb4-089d14519da9\") " pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.787605 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhddp\" (UniqueName: \"kubernetes.io/projected/9edf4077-fb2b-42e9-8fb4-089d14519da9-kube-api-access-fhddp\") pod \"certified-operators-8xw7p\" (UID: \"9edf4077-fb2b-42e9-8fb4-089d14519da9\") " pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.787630 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9edf4077-fb2b-42e9-8fb4-089d14519da9-utilities\") pod \"certified-operators-8xw7p\" (UID: \"9edf4077-fb2b-42e9-8fb4-089d14519da9\") " pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.796726 4846 scope.go:117] "RemoveContainer" containerID="7b48ba15c6bcc1b44a0ae7d40a9aae2844d9b229fada705db075632c95dad725" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.797642 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2lfzs" podStartSLOduration=2.797619063 podStartE2EDuration="2.797619063s" podCreationTimestamp="2025-11-22 09:18:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:18:25.791701949 +0000 UTC m=+280.727391608" watchObservedRunningTime="2025-11-22 09:18:25.797619063 +0000 UTC m=+280.733308712" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.819547 4846 scope.go:117] "RemoveContainer" containerID="aa04e1af7cf6179fa3a4ce5dcaeaa923541127719e196bc60a7cd78115f9b457" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.909809 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwm66"] Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.910229 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9edf4077-fb2b-42e9-8fb4-089d14519da9-catalog-content\") pod \"certified-operators-8xw7p\" (UID: \"9edf4077-fb2b-42e9-8fb4-089d14519da9\") " pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.910354 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhddp\" (UniqueName: \"kubernetes.io/projected/9edf4077-fb2b-42e9-8fb4-089d14519da9-kube-api-access-fhddp\") pod \"certified-operators-8xw7p\" (UID: \"9edf4077-fb2b-42e9-8fb4-089d14519da9\") " pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.910392 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9edf4077-fb2b-42e9-8fb4-089d14519da9-utilities\") pod \"certified-operators-8xw7p\" (UID: 
\"9edf4077-fb2b-42e9-8fb4-089d14519da9\") " pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.911226 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9edf4077-fb2b-42e9-8fb4-089d14519da9-utilities\") pod \"certified-operators-8xw7p\" (UID: \"9edf4077-fb2b-42e9-8fb4-089d14519da9\") " pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.911465 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9edf4077-fb2b-42e9-8fb4-089d14519da9-catalog-content\") pod \"certified-operators-8xw7p\" (UID: \"9edf4077-fb2b-42e9-8fb4-089d14519da9\") " pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.915871 4846 scope.go:117] "RemoveContainer" containerID="53f24d5605834a9b4de54e274362ad367ef1131956f9e21e51b23ce5a0fe5b82" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.932614 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zwm66"] Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.935918 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhddp\" (UniqueName: \"kubernetes.io/projected/9edf4077-fb2b-42e9-8fb4-089d14519da9-kube-api-access-fhddp\") pod \"certified-operators-8xw7p\" (UID: \"9edf4077-fb2b-42e9-8fb4-089d14519da9\") " pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.936224 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hmwp7"] Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.939886 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hmwp7"] Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.941922 4846 scope.go:117] "RemoveContainer" containerID="1111e8627056968b78bdb076952ee80fe1503376ed228b810aa9143bef239697" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.946969 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8mn4j"] Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.958134 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8mn4j"] Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.961873 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qplc2"] Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.966469 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qplc2"] Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.974589 4846 scope.go:117] "RemoveContainer" containerID="bb6b6b96db00b419ce5d373482d9cdd8c611f6f241b4cb73cbd2022750b803b1" Nov 22 09:18:25 crc kubenswrapper[4846]: I1122 09:18:25.990986 4846 scope.go:117] "RemoveContainer" containerID="e1d37d317cbef7a42b928be510127bdc51103001b4bd3830b4f6602ec97bcaa6" Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.015705 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.043812 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a7ef86c-1179-422c-90b7-c2e24e5687e9" path="/var/lib/kubelet/pods/1a7ef86c-1179-422c-90b7-c2e24e5687e9/volumes" Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.044736 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2faa496d-af10-40eb-984b-1a67af462dbf" path="/var/lib/kubelet/pods/2faa496d-af10-40eb-984b-1a67af462dbf/volumes" Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.045385 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750ea675-e79a-459b-8261-e15dd252a8f1" path="/var/lib/kubelet/pods/750ea675-e79a-459b-8261-e15dd252a8f1/volumes" Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.046267 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0da74d6-4145-4e5d-ac01-a5df2289d427" path="/var/lib/kubelet/pods/c0da74d6-4145-4e5d-ac01-a5df2289d427/volumes" Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.046836 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee01bf13-f20b-4778-83bc-ccbb4fa78da4" path="/var/lib/kubelet/pods/ee01bf13-f20b-4778-83bc-ccbb4fa78da4/volumes" Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.200759 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8xw7p"] Nov 22 09:18:26 crc kubenswrapper[4846]: W1122 09:18:26.205299 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9edf4077_fb2b_42e9_8fb4_089d14519da9.slice/crio-87432ec1c5050c92532e64c7f952a5a3f02c6283fb7b0f6ebaae528983d3e228 WatchSource:0}: Error finding container 87432ec1c5050c92532e64c7f952a5a3f02c6283fb7b0f6ebaae528983d3e228: Status 404 returned error can't find the container with id 87432ec1c5050c92532e64c7f952a5a3f02c6283fb7b0f6ebaae528983d3e228 Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.793944 4846 generic.go:334] "Generic (PLEG): container finished" podID="9edf4077-fb2b-42e9-8fb4-089d14519da9" containerID="3e0a435d3e21d44c15a90f8dcdda98fcb2a85626c4352f434f24788be9901e07" exitCode=0 Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.794260 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xw7p" event={"ID":"9edf4077-fb2b-42e9-8fb4-089d14519da9","Type":"ContainerDied","Data":"3e0a435d3e21d44c15a90f8dcdda98fcb2a85626c4352f434f24788be9901e07"} Nov 22 09:18:26 crc kubenswrapper[4846]: I1122 09:18:26.794434 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xw7p" event={"ID":"9edf4077-fb2b-42e9-8fb4-089d14519da9","Type":"ContainerStarted","Data":"87432ec1c5050c92532e64c7f952a5a3f02c6283fb7b0f6ebaae528983d3e228"} Nov 22 09:18:27 crc kubenswrapper[4846]: I1122 09:18:27.814181 4846 generic.go:334] "Generic (PLEG): container finished" podID="9edf4077-fb2b-42e9-8fb4-089d14519da9" containerID="a85f416a076d202c5df40a1cdc86fb5c091d30b0c557cd2409825cc308124592" exitCode=0 Nov 22 09:18:27 crc kubenswrapper[4846]: I1122 09:18:27.814287 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xw7p" event={"ID":"9edf4077-fb2b-42e9-8fb4-089d14519da9","Type":"ContainerDied","Data":"a85f416a076d202c5df40a1cdc86fb5c091d30b0c557cd2409825cc308124592"} Nov 22 09:18:27 crc 
kubenswrapper[4846]: I1122 09:18:27.883224 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z55r4"] Nov 22 09:18:27 crc kubenswrapper[4846]: I1122 09:18:27.884253 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:27 crc kubenswrapper[4846]: I1122 09:18:27.886530 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 09:18:27 crc kubenswrapper[4846]: I1122 09:18:27.892451 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z55r4"] Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.037376 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/256ed2d8-7444-4001-ae7c-2592adcb4e72-catalog-content\") pod \"redhat-operators-z55r4\" (UID: \"256ed2d8-7444-4001-ae7c-2592adcb4e72\") " pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.038171 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlts9\" (UniqueName: \"kubernetes.io/projected/256ed2d8-7444-4001-ae7c-2592adcb4e72-kube-api-access-nlts9\") pod \"redhat-operators-z55r4\" (UID: \"256ed2d8-7444-4001-ae7c-2592adcb4e72\") " pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.038374 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/256ed2d8-7444-4001-ae7c-2592adcb4e72-utilities\") pod \"redhat-operators-z55r4\" (UID: \"256ed2d8-7444-4001-ae7c-2592adcb4e72\") " pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.089835 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8nbgq"] Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.091301 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.094566 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.097008 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nbgq"] Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.139166 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlts9\" (UniqueName: \"kubernetes.io/projected/256ed2d8-7444-4001-ae7c-2592adcb4e72-kube-api-access-nlts9\") pod \"redhat-operators-z55r4\" (UID: \"256ed2d8-7444-4001-ae7c-2592adcb4e72\") " pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.139228 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdm5l\" (UniqueName: \"kubernetes.io/projected/38303d50-92e8-4134-9869-52964f9d76f0-kube-api-access-kdm5l\") pod \"community-operators-8nbgq\" (UID: \"38303d50-92e8-4134-9869-52964f9d76f0\") " pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.139255 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38303d50-92e8-4134-9869-52964f9d76f0-utilities\") pod \"community-operators-8nbgq\" (UID: \"38303d50-92e8-4134-9869-52964f9d76f0\") " pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.139289 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/256ed2d8-7444-4001-ae7c-2592adcb4e72-utilities\") pod \"redhat-operators-z55r4\" (UID: \"256ed2d8-7444-4001-ae7c-2592adcb4e72\") " pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.139340 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/256ed2d8-7444-4001-ae7c-2592adcb4e72-catalog-content\") pod \"redhat-operators-z55r4\" (UID: \"256ed2d8-7444-4001-ae7c-2592adcb4e72\") " pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.139358 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38303d50-92e8-4134-9869-52964f9d76f0-catalog-content\") pod \"community-operators-8nbgq\" (UID: \"38303d50-92e8-4134-9869-52964f9d76f0\") " pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.139739 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/256ed2d8-7444-4001-ae7c-2592adcb4e72-utilities\") pod \"redhat-operators-z55r4\" (UID: \"256ed2d8-7444-4001-ae7c-2592adcb4e72\") " pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.139794 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/256ed2d8-7444-4001-ae7c-2592adcb4e72-catalog-content\") pod \"redhat-operators-z55r4\" (UID: 
\"256ed2d8-7444-4001-ae7c-2592adcb4e72\") " pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.166685 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlts9\" (UniqueName: \"kubernetes.io/projected/256ed2d8-7444-4001-ae7c-2592adcb4e72-kube-api-access-nlts9\") pod \"redhat-operators-z55r4\" (UID: \"256ed2d8-7444-4001-ae7c-2592adcb4e72\") " pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.203785 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.239993 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdm5l\" (UniqueName: \"kubernetes.io/projected/38303d50-92e8-4134-9869-52964f9d76f0-kube-api-access-kdm5l\") pod \"community-operators-8nbgq\" (UID: \"38303d50-92e8-4134-9869-52964f9d76f0\") " pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.240129 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38303d50-92e8-4134-9869-52964f9d76f0-utilities\") pod \"community-operators-8nbgq\" (UID: \"38303d50-92e8-4134-9869-52964f9d76f0\") " pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.240173 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38303d50-92e8-4134-9869-52964f9d76f0-catalog-content\") pod \"community-operators-8nbgq\" (UID: \"38303d50-92e8-4134-9869-52964f9d76f0\") " pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.240816 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38303d50-92e8-4134-9869-52964f9d76f0-catalog-content\") pod \"community-operators-8nbgq\" (UID: \"38303d50-92e8-4134-9869-52964f9d76f0\") " pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.240943 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38303d50-92e8-4134-9869-52964f9d76f0-utilities\") pod \"community-operators-8nbgq\" (UID: \"38303d50-92e8-4134-9869-52964f9d76f0\") " pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.265588 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdm5l\" (UniqueName: \"kubernetes.io/projected/38303d50-92e8-4134-9869-52964f9d76f0-kube-api-access-kdm5l\") pod \"community-operators-8nbgq\" (UID: \"38303d50-92e8-4134-9869-52964f9d76f0\") " pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.409713 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.595832 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nbgq"] Nov 22 09:18:28 crc kubenswrapper[4846]: W1122 09:18:28.620181 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38303d50_92e8_4134_9869_52964f9d76f0.slice/crio-c8012e1e6b176477c4058101effe3a303ec9066d530deb448b2943e485d81f68 WatchSource:0}: Error finding container c8012e1e6b176477c4058101effe3a303ec9066d530deb448b2943e485d81f68: Status 404 returned error can't find the container with id c8012e1e6b176477c4058101effe3a303ec9066d530deb448b2943e485d81f68 Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.646321 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z55r4"] Nov 22 09:18:28 crc kubenswrapper[4846]: W1122 09:18:28.652079 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod256ed2d8_7444_4001_ae7c_2592adcb4e72.slice/crio-449c8a4d661d65f2f69d8c51b3390725b94a8e94fbec53f0b5d238049b492853 WatchSource:0}: Error finding container 449c8a4d661d65f2f69d8c51b3390725b94a8e94fbec53f0b5d238049b492853: Status 404 returned error can't find the container with id 449c8a4d661d65f2f69d8c51b3390725b94a8e94fbec53f0b5d238049b492853 Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.823371 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xw7p" event={"ID":"9edf4077-fb2b-42e9-8fb4-089d14519da9","Type":"ContainerStarted","Data":"940de7ac5061077cbf63f20ac56a432069aaaad12ed7c37f89528a2b468849f5"} Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.826960 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z55r4" event={"ID":"256ed2d8-7444-4001-ae7c-2592adcb4e72","Type":"ContainerStarted","Data":"449c8a4d661d65f2f69d8c51b3390725b94a8e94fbec53f0b5d238049b492853"} Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.828202 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nbgq" event={"ID":"38303d50-92e8-4134-9869-52964f9d76f0","Type":"ContainerStarted","Data":"c8012e1e6b176477c4058101effe3a303ec9066d530deb448b2943e485d81f68"} Nov 22 09:18:28 crc kubenswrapper[4846]: I1122 09:18:28.844221 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8xw7p" podStartSLOduration=2.417907806 podStartE2EDuration="3.844195138s" podCreationTimestamp="2025-11-22 09:18:25 +0000 UTC" firstStartedPulling="2025-11-22 09:18:26.79815188 +0000 UTC m=+281.733841529" lastFinishedPulling="2025-11-22 09:18:28.224439212 +0000 UTC m=+283.160128861" observedRunningTime="2025-11-22 09:18:28.840707935 +0000 UTC m=+283.776397604" watchObservedRunningTime="2025-11-22 09:18:28.844195138 +0000 UTC m=+283.779884787" Nov 22 09:18:29 crc kubenswrapper[4846]: I1122 09:18:29.835707 4846 generic.go:334] "Generic (PLEG): container finished" podID="256ed2d8-7444-4001-ae7c-2592adcb4e72" containerID="59b7f8af169b4934b519de5c0ed3f105be60daf4984668473eec6f2480c62376" exitCode=0 Nov 22 09:18:29 crc kubenswrapper[4846]: I1122 09:18:29.836012 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z55r4" 
event={"ID":"256ed2d8-7444-4001-ae7c-2592adcb4e72","Type":"ContainerDied","Data":"59b7f8af169b4934b519de5c0ed3f105be60daf4984668473eec6f2480c62376"} Nov 22 09:18:29 crc kubenswrapper[4846]: I1122 09:18:29.837258 4846 generic.go:334] "Generic (PLEG): container finished" podID="38303d50-92e8-4134-9869-52964f9d76f0" containerID="898ba89fc422918d544fb8c7cc70a29fb654eef41b0d1a41ef04175eec64337f" exitCode=0 Nov 22 09:18:29 crc kubenswrapper[4846]: I1122 09:18:29.837294 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nbgq" event={"ID":"38303d50-92e8-4134-9869-52964f9d76f0","Type":"ContainerDied","Data":"898ba89fc422918d544fb8c7cc70a29fb654eef41b0d1a41ef04175eec64337f"} Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.297671 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bxljw"] Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.299215 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.301920 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.303306 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxljw"] Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.470068 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qkd\" (UniqueName: \"kubernetes.io/projected/17186de5-6faa-416f-a138-e32ed89d2ad5-kube-api-access-58qkd\") pod \"redhat-marketplace-bxljw\" (UID: \"17186de5-6faa-416f-a138-e32ed89d2ad5\") " pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.470136 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17186de5-6faa-416f-a138-e32ed89d2ad5-utilities\") pod \"redhat-marketplace-bxljw\" (UID: \"17186de5-6faa-416f-a138-e32ed89d2ad5\") " pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.470173 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17186de5-6faa-416f-a138-e32ed89d2ad5-catalog-content\") pod \"redhat-marketplace-bxljw\" (UID: \"17186de5-6faa-416f-a138-e32ed89d2ad5\") " pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.570972 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qkd\" (UniqueName: \"kubernetes.io/projected/17186de5-6faa-416f-a138-e32ed89d2ad5-kube-api-access-58qkd\") pod \"redhat-marketplace-bxljw\" (UID: \"17186de5-6faa-416f-a138-e32ed89d2ad5\") " pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.571077 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17186de5-6faa-416f-a138-e32ed89d2ad5-utilities\") pod \"redhat-marketplace-bxljw\" (UID: \"17186de5-6faa-416f-a138-e32ed89d2ad5\") " pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.571122 4846 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17186de5-6faa-416f-a138-e32ed89d2ad5-catalog-content\") pod \"redhat-marketplace-bxljw\" (UID: \"17186de5-6faa-416f-a138-e32ed89d2ad5\") " pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.571674 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17186de5-6faa-416f-a138-e32ed89d2ad5-catalog-content\") pod \"redhat-marketplace-bxljw\" (UID: \"17186de5-6faa-416f-a138-e32ed89d2ad5\") " pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.571759 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17186de5-6faa-416f-a138-e32ed89d2ad5-utilities\") pod \"redhat-marketplace-bxljw\" (UID: \"17186de5-6faa-416f-a138-e32ed89d2ad5\") " pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.592813 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qkd\" (UniqueName: \"kubernetes.io/projected/17186de5-6faa-416f-a138-e32ed89d2ad5-kube-api-access-58qkd\") pod \"redhat-marketplace-bxljw\" (UID: \"17186de5-6faa-416f-a138-e32ed89d2ad5\") " pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:30 crc kubenswrapper[4846]: I1122 09:18:30.656595 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:31 crc kubenswrapper[4846]: I1122 09:18:31.444851 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxljw"] Nov 22 09:18:31 crc kubenswrapper[4846]: I1122 09:18:31.849422 4846 generic.go:334] "Generic (PLEG): container finished" podID="38303d50-92e8-4134-9869-52964f9d76f0" containerID="ee44b4869f484b7f14653b454272c98701b8e3f18c36eb562653d3754eea4da8" exitCode=0 Nov 22 09:18:31 crc kubenswrapper[4846]: I1122 09:18:31.849523 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nbgq" event={"ID":"38303d50-92e8-4134-9869-52964f9d76f0","Type":"ContainerDied","Data":"ee44b4869f484b7f14653b454272c98701b8e3f18c36eb562653d3754eea4da8"} Nov 22 09:18:31 crc kubenswrapper[4846]: I1122 09:18:31.852907 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z55r4" event={"ID":"256ed2d8-7444-4001-ae7c-2592adcb4e72","Type":"ContainerStarted","Data":"d0e54eb0de38e1d239858ce1445aba6b94417094bcdcc0ac999deb5086750f34"} Nov 22 09:18:31 crc kubenswrapper[4846]: I1122 09:18:31.862523 4846 generic.go:334] "Generic (PLEG): container finished" podID="17186de5-6faa-416f-a138-e32ed89d2ad5" containerID="3291431be4631e33715dfbd58fef8a6e75616ae56793645f54ebaa55f09a797c" exitCode=0 Nov 22 09:18:31 crc kubenswrapper[4846]: I1122 09:18:31.862584 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxljw" event={"ID":"17186de5-6faa-416f-a138-e32ed89d2ad5","Type":"ContainerDied","Data":"3291431be4631e33715dfbd58fef8a6e75616ae56793645f54ebaa55f09a797c"} Nov 22 09:18:31 crc kubenswrapper[4846]: I1122 09:18:31.862618 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxljw" 
event={"ID":"17186de5-6faa-416f-a138-e32ed89d2ad5","Type":"ContainerStarted","Data":"3d178213fff066f6405fc31f17cd6fef70d134cb2eda564ae3ac73c5c1f8db4f"} Nov 22 09:18:32 crc kubenswrapper[4846]: I1122 09:18:32.869850 4846 generic.go:334] "Generic (PLEG): container finished" podID="256ed2d8-7444-4001-ae7c-2592adcb4e72" containerID="d0e54eb0de38e1d239858ce1445aba6b94417094bcdcc0ac999deb5086750f34" exitCode=0 Nov 22 09:18:32 crc kubenswrapper[4846]: I1122 09:18:32.869942 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z55r4" event={"ID":"256ed2d8-7444-4001-ae7c-2592adcb4e72","Type":"ContainerDied","Data":"d0e54eb0de38e1d239858ce1445aba6b94417094bcdcc0ac999deb5086750f34"} Nov 22 09:18:33 crc kubenswrapper[4846]: I1122 09:18:33.878844 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nbgq" event={"ID":"38303d50-92e8-4134-9869-52964f9d76f0","Type":"ContainerStarted","Data":"975a731a707a74236162df7a5b77c41ef962616b1c30ae74b85e71b91314fa18"} Nov 22 09:18:34 crc kubenswrapper[4846]: I1122 09:18:34.904466 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8nbgq" podStartSLOduration=4.212523716 podStartE2EDuration="6.90444082s" podCreationTimestamp="2025-11-22 09:18:28 +0000 UTC" firstStartedPulling="2025-11-22 09:18:29.838861042 +0000 UTC m=+284.774550681" lastFinishedPulling="2025-11-22 09:18:32.530778136 +0000 UTC m=+287.466467785" observedRunningTime="2025-11-22 09:18:34.901937407 +0000 UTC m=+289.837627066" watchObservedRunningTime="2025-11-22 09:18:34.90444082 +0000 UTC m=+289.840130459" Nov 22 09:18:35 crc kubenswrapper[4846]: I1122 09:18:35.890251 4846 generic.go:334] "Generic (PLEG): container finished" podID="17186de5-6faa-416f-a138-e32ed89d2ad5" containerID="b1a732a93b1d62b9203ad5a1ff73375f69b3d6c124a551aec5b61e2ada1dd6d8" exitCode=0 Nov 22 09:18:35 crc kubenswrapper[4846]: I1122 09:18:35.890333 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxljw" event={"ID":"17186de5-6faa-416f-a138-e32ed89d2ad5","Type":"ContainerDied","Data":"b1a732a93b1d62b9203ad5a1ff73375f69b3d6c124a551aec5b61e2ada1dd6d8"} Nov 22 09:18:36 crc kubenswrapper[4846]: I1122 09:18:36.015850 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:36 crc kubenswrapper[4846]: I1122 09:18:36.017451 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:36 crc kubenswrapper[4846]: I1122 09:18:36.060685 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:36 crc kubenswrapper[4846]: I1122 09:18:36.899149 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z55r4" event={"ID":"256ed2d8-7444-4001-ae7c-2592adcb4e72","Type":"ContainerStarted","Data":"d6df5cd9fdaea186d93726ed703b1e5391fcd022751cfc189fce6aa3deaf82f7"} Nov 22 09:18:36 crc kubenswrapper[4846]: I1122 09:18:36.917488 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z55r4" podStartSLOduration=4.014405762 podStartE2EDuration="9.917465416s" podCreationTimestamp="2025-11-22 09:18:27 +0000 UTC" firstStartedPulling="2025-11-22 09:18:29.838378638 +0000 UTC m=+284.774068287" 
lastFinishedPulling="2025-11-22 09:18:35.741438252 +0000 UTC m=+290.677127941" observedRunningTime="2025-11-22 09:18:36.916185068 +0000 UTC m=+291.851874747" watchObservedRunningTime="2025-11-22 09:18:36.917465416 +0000 UTC m=+291.853155065" Nov 22 09:18:36 crc kubenswrapper[4846]: I1122 09:18:36.952232 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8xw7p" Nov 22 09:18:38 crc kubenswrapper[4846]: I1122 09:18:38.204096 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:38 crc kubenswrapper[4846]: I1122 09:18:38.204641 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:38 crc kubenswrapper[4846]: I1122 09:18:38.410349 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:38 crc kubenswrapper[4846]: I1122 09:18:38.410429 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:38 crc kubenswrapper[4846]: I1122 09:18:38.447504 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:38 crc kubenswrapper[4846]: I1122 09:18:38.954038 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8nbgq" Nov 22 09:18:39 crc kubenswrapper[4846]: I1122 09:18:39.256422 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z55r4" podUID="256ed2d8-7444-4001-ae7c-2592adcb4e72" containerName="registry-server" probeResult="failure" output=< Nov 22 09:18:39 crc kubenswrapper[4846]: timeout: failed to connect service ":50051" within 1s Nov 22 09:18:39 crc kubenswrapper[4846]: > Nov 22 09:18:46 crc kubenswrapper[4846]: I1122 09:18:46.964132 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxljw" event={"ID":"17186de5-6faa-416f-a138-e32ed89d2ad5","Type":"ContainerStarted","Data":"65b51262205e02204d79e342bc0d45d96d078e33bcf8897cc06707f170ab0f11"} Nov 22 09:18:46 crc kubenswrapper[4846]: I1122 09:18:46.989987 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bxljw" podStartSLOduration=4.426708124 podStartE2EDuration="16.989965754s" podCreationTimestamp="2025-11-22 09:18:30 +0000 UTC" firstStartedPulling="2025-11-22 09:18:31.868330832 +0000 UTC m=+286.804020481" lastFinishedPulling="2025-11-22 09:18:44.431588472 +0000 UTC m=+299.367278111" observedRunningTime="2025-11-22 09:18:46.987336536 +0000 UTC m=+301.923026195" watchObservedRunningTime="2025-11-22 09:18:46.989965754 +0000 UTC m=+301.925655403" Nov 22 09:18:48 crc kubenswrapper[4846]: I1122 09:18:48.243903 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:48 crc kubenswrapper[4846]: I1122 09:18:48.282074 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z55r4" Nov 22 09:18:50 crc kubenswrapper[4846]: I1122 09:18:50.657763 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:50 crc kubenswrapper[4846]: I1122 
Nov 22 09:18:50 crc kubenswrapper[4846]: I1122 09:18:50.657813 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:18:50 crc kubenswrapper[4846]: I1122 09:18:50.693203 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:19:00 crc kubenswrapper[4846]: I1122 09:19:00.711759 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bxljw" Nov 22 09:19:58 crc kubenswrapper[4846]: I1122 09:19:58.625980 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:19:58 crc kubenswrapper[4846]: I1122 09:19:58.626666 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:20:28 crc kubenswrapper[4846]: I1122 09:20:28.625500 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:20:28 crc kubenswrapper[4846]: I1122 09:20:28.626512 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:20:58 crc kubenswrapper[4846]: I1122 09:20:58.625748 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:20:58 crc kubenswrapper[4846]: I1122 09:20:58.626611 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:20:58 crc kubenswrapper[4846]: I1122 09:20:58.627261 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:20:58 crc kubenswrapper[4846]: I1122 09:20:58.627858 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9d1e242bde74884effedf6ed226573341c9a217ba1fc454c2f0c977522434d6"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:20:58 crc kubenswrapper[4846]: I1122 09:20:58.627919 4846 kuberuntime_container.go:808] "Killing container with a grace period"
pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://e9d1e242bde74884effedf6ed226573341c9a217ba1fc454c2f0c977522434d6" gracePeriod=600 Nov 22 09:20:58 crc kubenswrapper[4846]: I1122 09:20:58.772660 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="e9d1e242bde74884effedf6ed226573341c9a217ba1fc454c2f0c977522434d6" exitCode=0 Nov 22 09:20:58 crc kubenswrapper[4846]: I1122 09:20:58.772710 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"e9d1e242bde74884effedf6ed226573341c9a217ba1fc454c2f0c977522434d6"} Nov 22 09:20:58 crc kubenswrapper[4846]: I1122 09:20:58.772747 4846 scope.go:117] "RemoveContainer" containerID="8efd6efa9c9237b02c9a4ec5f9005b2b2f4fd9bfc89074196ce13524210b9e50" Nov 22 09:20:59 crc kubenswrapper[4846]: I1122 09:20:59.784824 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"2d6e3598b04fea951b6da83a54c8c53b23be887c3070db956e053d32e85f6afe"} Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.666408 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sf6mv"] Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.668723 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.677223 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sf6mv"] Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.843759 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dd9930a-5262-4b82-9a3c-7677f5714253-trusted-ca\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.843816 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6dd9930a-5262-4b82-9a3c-7677f5714253-bound-sa-token\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.843869 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.844480 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6dd9930a-5262-4b82-9a3c-7677f5714253-installation-pull-secrets\") pod 
\"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.844532 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6dd9930a-5262-4b82-9a3c-7677f5714253-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.844561 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79k7r\" (UniqueName: \"kubernetes.io/projected/6dd9930a-5262-4b82-9a3c-7677f5714253-kube-api-access-79k7r\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.844624 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6dd9930a-5262-4b82-9a3c-7677f5714253-registry-tls\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.844653 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6dd9930a-5262-4b82-9a3c-7677f5714253-registry-certificates\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.867409 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.945868 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6dd9930a-5262-4b82-9a3c-7677f5714253-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.945947 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79k7r\" (UniqueName: \"kubernetes.io/projected/6dd9930a-5262-4b82-9a3c-7677f5714253-kube-api-access-79k7r\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.946004 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6dd9930a-5262-4b82-9a3c-7677f5714253-registry-tls\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.946067 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6dd9930a-5262-4b82-9a3c-7677f5714253-registry-certificates\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.946195 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dd9930a-5262-4b82-9a3c-7677f5714253-trusted-ca\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.946249 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6dd9930a-5262-4b82-9a3c-7677f5714253-bound-sa-token\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.946319 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6dd9930a-5262-4b82-9a3c-7677f5714253-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.946721 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6dd9930a-5262-4b82-9a3c-7677f5714253-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.947679 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dd9930a-5262-4b82-9a3c-7677f5714253-trusted-ca\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.948122 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6dd9930a-5262-4b82-9a3c-7677f5714253-registry-certificates\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.953690 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6dd9930a-5262-4b82-9a3c-7677f5714253-registry-tls\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.953712 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/6dd9930a-5262-4b82-9a3c-7677f5714253-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.968442 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79k7r\" (UniqueName: \"kubernetes.io/projected/6dd9930a-5262-4b82-9a3c-7677f5714253-kube-api-access-79k7r\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.968964 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6dd9930a-5262-4b82-9a3c-7677f5714253-bound-sa-token\") pod \"image-registry-66df7c8f76-sf6mv\" (UID: \"6dd9930a-5262-4b82-9a3c-7677f5714253\") " pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:34 crc kubenswrapper[4846]: I1122 09:21:34.994988 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:35 crc kubenswrapper[4846]: I1122 09:21:35.206980 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sf6mv"] Nov 22 09:21:36 crc kubenswrapper[4846]: I1122 09:21:36.044205 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" event={"ID":"6dd9930a-5262-4b82-9a3c-7677f5714253","Type":"ContainerStarted","Data":"3690ac06461cc333d0c83c868b8d409153e9300f5be3c2072b5236d77cbda152"} Nov 22 09:21:36 crc kubenswrapper[4846]: I1122 09:21:36.044635 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" event={"ID":"6dd9930a-5262-4b82-9a3c-7677f5714253","Type":"ContainerStarted","Data":"e2a6d509f9db708132c2a37e383d1ca20f3688ab1ab6f547ad7ebfb566cf627b"} Nov 22 09:21:36 crc kubenswrapper[4846]: I1122 09:21:36.044652 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:36 crc kubenswrapper[4846]: I1122 09:21:36.098007 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" podStartSLOduration=2.097978471 podStartE2EDuration="2.097978471s" podCreationTimestamp="2025-11-22 09:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:21:36.093223121 +0000 UTC m=+471.028912800" watchObservedRunningTime="2025-11-22 09:21:36.097978471 +0000 UTC m=+471.033668120" Nov 22 09:21:55 crc kubenswrapper[4846]: I1122 09:21:55.000225 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-sf6mv" Nov 22 09:21:55 crc kubenswrapper[4846]: I1122 09:21:55.088370 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nfrf8"] Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.143566 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" podUID="92822bda-884a-4bfc-b651-f58624599346" containerName="registry" 
containerID="cri-o://77932a7e98b3339e204a32539f23d75b76d8eaae2400997125a7ab88f76aa9a1" gracePeriod=30 Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.339057 4846 generic.go:334] "Generic (PLEG): container finished" podID="92822bda-884a-4bfc-b651-f58624599346" containerID="77932a7e98b3339e204a32539f23d75b76d8eaae2400997125a7ab88f76aa9a1" exitCode=0 Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.339094 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" event={"ID":"92822bda-884a-4bfc-b651-f58624599346","Type":"ContainerDied","Data":"77932a7e98b3339e204a32539f23d75b76d8eaae2400997125a7ab88f76aa9a1"} Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.544620 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.649333 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5482\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-kube-api-access-p5482\") pod \"92822bda-884a-4bfc-b651-f58624599346\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.649454 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92822bda-884a-4bfc-b651-f58624599346-installation-pull-secrets\") pod \"92822bda-884a-4bfc-b651-f58624599346\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.649502 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-trusted-ca\") pod \"92822bda-884a-4bfc-b651-f58624599346\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.649757 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"92822bda-884a-4bfc-b651-f58624599346\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.649795 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-registry-tls\") pod \"92822bda-884a-4bfc-b651-f58624599346\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.649867 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-registry-certificates\") pod \"92822bda-884a-4bfc-b651-f58624599346\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.649910 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-bound-sa-token\") pod \"92822bda-884a-4bfc-b651-f58624599346\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.649956 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92822bda-884a-4bfc-b651-f58624599346-ca-trust-extracted\") pod \"92822bda-884a-4bfc-b651-f58624599346\" (UID: \"92822bda-884a-4bfc-b651-f58624599346\") " Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.651505 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "92822bda-884a-4bfc-b651-f58624599346" (UID: "92822bda-884a-4bfc-b651-f58624599346"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.651572 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "92822bda-884a-4bfc-b651-f58624599346" (UID: "92822bda-884a-4bfc-b651-f58624599346"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.657433 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "92822bda-884a-4bfc-b651-f58624599346" (UID: "92822bda-884a-4bfc-b651-f58624599346"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.658956 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92822bda-884a-4bfc-b651-f58624599346-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "92822bda-884a-4bfc-b651-f58624599346" (UID: "92822bda-884a-4bfc-b651-f58624599346"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.658984 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-kube-api-access-p5482" (OuterVolumeSpecName: "kube-api-access-p5482") pod "92822bda-884a-4bfc-b651-f58624599346" (UID: "92822bda-884a-4bfc-b651-f58624599346"). InnerVolumeSpecName "kube-api-access-p5482". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.659372 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "92822bda-884a-4bfc-b651-f58624599346" (UID: "92822bda-884a-4bfc-b651-f58624599346"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.665023 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "92822bda-884a-4bfc-b651-f58624599346" (UID: "92822bda-884a-4bfc-b651-f58624599346"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.669357 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92822bda-884a-4bfc-b651-f58624599346-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "92822bda-884a-4bfc-b651-f58624599346" (UID: "92822bda-884a-4bfc-b651-f58624599346"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.751838 4846 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.751897 4846 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.751910 4846 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/92822bda-884a-4bfc-b651-f58624599346-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.751919 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5482\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-kube-api-access-p5482\") on node \"crc\" DevicePath \"\"" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.751934 4846 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/92822bda-884a-4bfc-b651-f58624599346-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.751943 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92822bda-884a-4bfc-b651-f58624599346-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:22:20 crc kubenswrapper[4846]: I1122 09:22:20.751952 4846 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/92822bda-884a-4bfc-b651-f58624599346-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:22:21 crc kubenswrapper[4846]: I1122 09:22:21.349530 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" event={"ID":"92822bda-884a-4bfc-b651-f58624599346","Type":"ContainerDied","Data":"f811e52055a7ead2989ef643901f3531923566c22d00322ae616c67e6b33fcca"} Nov 22 09:22:21 crc kubenswrapper[4846]: I1122 09:22:21.349625 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nfrf8" Nov 22 09:22:21 crc kubenswrapper[4846]: I1122 09:22:21.350239 4846 scope.go:117] "RemoveContainer" containerID="77932a7e98b3339e204a32539f23d75b76d8eaae2400997125a7ab88f76aa9a1" Nov 22 09:22:21 crc kubenswrapper[4846]: I1122 09:22:21.395705 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nfrf8"] Nov 22 09:22:21 crc kubenswrapper[4846]: I1122 09:22:21.401232 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nfrf8"] Nov 22 09:22:22 crc kubenswrapper[4846]: I1122 09:22:22.044396 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92822bda-884a-4bfc-b651-f58624599346" path="/var/lib/kubelet/pods/92822bda-884a-4bfc-b651-f58624599346/volumes" Nov 22 09:22:46 crc kubenswrapper[4846]: I1122 09:22:46.197788 4846 scope.go:117] "RemoveContainer" containerID="e4b01192a1a9e26366ede83d1d199bfc371ddeb30b5a974e112f367f1abcf317" Nov 22 09:22:58 crc kubenswrapper[4846]: I1122 09:22:58.625609 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:22:58 crc kubenswrapper[4846]: I1122 09:22:58.626408 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:23:28 crc kubenswrapper[4846]: I1122 09:23:28.626144 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:23:28 crc kubenswrapper[4846]: I1122 09:23:28.627232 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:23:46 crc kubenswrapper[4846]: I1122 09:23:46.238909 4846 scope.go:117] "RemoveContainer" containerID="6d5975245388b44d660726017c614b92a237cac750c99aefb11fc3cfb8605100" Nov 22 09:23:46 crc kubenswrapper[4846]: I1122 09:23:46.272480 4846 scope.go:117] "RemoveContainer" containerID="4a6fd49ba054638e1afde2921a374769bd7d81c2f8930f755532160fee682c4a" Nov 22 09:23:58 crc kubenswrapper[4846]: I1122 09:23:58.626486 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:23:58 crc kubenswrapper[4846]: I1122 09:23:58.628289 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:23:58 crc kubenswrapper[4846]: I1122 09:23:58.628428 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:23:58 crc kubenswrapper[4846]: I1122 09:23:58.629472 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d6e3598b04fea951b6da83a54c8c53b23be887c3070db956e053d32e85f6afe"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:23:58 crc kubenswrapper[4846]: I1122 09:23:58.629558 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://2d6e3598b04fea951b6da83a54c8c53b23be887c3070db956e053d32e85f6afe" gracePeriod=600 Nov 22 09:23:58 crc kubenswrapper[4846]: I1122 09:23:58.992949 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="2d6e3598b04fea951b6da83a54c8c53b23be887c3070db956e053d32e85f6afe" exitCode=0 Nov 22 09:23:58 crc kubenswrapper[4846]: I1122 09:23:58.993052 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"2d6e3598b04fea951b6da83a54c8c53b23be887c3070db956e053d32e85f6afe"} Nov 22 09:23:58 crc kubenswrapper[4846]: I1122 09:23:58.993682 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"98f9262a8d10b551be9acdbca7c91a24b8c83945ea853c86e2932b08cb27780b"} Nov 22 09:23:58 crc kubenswrapper[4846]: I1122 09:23:58.993726 4846 scope.go:117] "RemoveContainer" containerID="e9d1e242bde74884effedf6ed226573341c9a217ba1fc454c2f0c977522434d6" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.782349 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d4jkh"] Nov 22 09:24:55 crc kubenswrapper[4846]: E1122 09:24:55.783292 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92822bda-884a-4bfc-b651-f58624599346" containerName="registry" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.783306 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="92822bda-884a-4bfc-b651-f58624599346" containerName="registry" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.783407 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="92822bda-884a-4bfc-b651-f58624599346" containerName="registry" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.783807 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-d4jkh" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.791662 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.792080 4846 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qw82h" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.792280 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.795433 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nb9wl"] Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.796298 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nb9wl" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.800458 4846 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8b94d" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.815200 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d4jkh"] Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.826215 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nb9wl"] Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.856724 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mppx8"] Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.857750 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.859568 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4rz6\" (UniqueName: \"kubernetes.io/projected/1d9cea2b-9f89-437e-a0f3-875b123a47d3-kube-api-access-x4rz6\") pod \"cert-manager-5b446d88c5-nb9wl\" (UID: \"1d9cea2b-9f89-437e-a0f3-875b123a47d3\") " pod="cert-manager/cert-manager-5b446d88c5-nb9wl" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.859613 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk6dd\" (UniqueName: \"kubernetes.io/projected/c55358d6-9876-4e6a-9b06-08db6080a803-kube-api-access-tk6dd\") pod \"cert-manager-cainjector-7f985d654d-d4jkh\" (UID: \"c55358d6-9876-4e6a-9b06-08db6080a803\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d4jkh" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.860647 4846 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-v8rvd" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.863123 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mppx8"] Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.960524 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4rz6\" (UniqueName: \"kubernetes.io/projected/1d9cea2b-9f89-437e-a0f3-875b123a47d3-kube-api-access-x4rz6\") pod \"cert-manager-5b446d88c5-nb9wl\" (UID: \"1d9cea2b-9f89-437e-a0f3-875b123a47d3\") " pod="cert-manager/cert-manager-5b446d88c5-nb9wl" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 
09:24:55.960578 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6dd\" (UniqueName: \"kubernetes.io/projected/c55358d6-9876-4e6a-9b06-08db6080a803-kube-api-access-tk6dd\") pod \"cert-manager-cainjector-7f985d654d-d4jkh\" (UID: \"c55358d6-9876-4e6a-9b06-08db6080a803\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d4jkh" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.960608 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26l4x\" (UniqueName: \"kubernetes.io/projected/27cc0714-ab99-4ddc-9e9c-66f24bba9fac-kube-api-access-26l4x\") pod \"cert-manager-webhook-5655c58dd6-mppx8\" (UID: \"27cc0714-ab99-4ddc-9e9c-66f24bba9fac\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.982634 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk6dd\" (UniqueName: \"kubernetes.io/projected/c55358d6-9876-4e6a-9b06-08db6080a803-kube-api-access-tk6dd\") pod \"cert-manager-cainjector-7f985d654d-d4jkh\" (UID: \"c55358d6-9876-4e6a-9b06-08db6080a803\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d4jkh" Nov 22 09:24:55 crc kubenswrapper[4846]: I1122 09:24:55.983199 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4rz6\" (UniqueName: \"kubernetes.io/projected/1d9cea2b-9f89-437e-a0f3-875b123a47d3-kube-api-access-x4rz6\") pod \"cert-manager-5b446d88c5-nb9wl\" (UID: \"1d9cea2b-9f89-437e-a0f3-875b123a47d3\") " pod="cert-manager/cert-manager-5b446d88c5-nb9wl" Nov 22 09:24:56 crc kubenswrapper[4846]: I1122 09:24:56.062417 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26l4x\" (UniqueName: \"kubernetes.io/projected/27cc0714-ab99-4ddc-9e9c-66f24bba9fac-kube-api-access-26l4x\") pod \"cert-manager-webhook-5655c58dd6-mppx8\" (UID: \"27cc0714-ab99-4ddc-9e9c-66f24bba9fac\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" Nov 22 09:24:56 crc kubenswrapper[4846]: I1122 09:24:56.080972 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26l4x\" (UniqueName: \"kubernetes.io/projected/27cc0714-ab99-4ddc-9e9c-66f24bba9fac-kube-api-access-26l4x\") pod \"cert-manager-webhook-5655c58dd6-mppx8\" (UID: \"27cc0714-ab99-4ddc-9e9c-66f24bba9fac\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" Nov 22 09:24:56 crc kubenswrapper[4846]: I1122 09:24:56.107844 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-d4jkh" Nov 22 09:24:56 crc kubenswrapper[4846]: I1122 09:24:56.132602 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nb9wl"
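Each kube-api-access-* volume mounted above is the projected service-account volume: a bound token plus the kube-root-ca.crt config map (whose reflector cache population is logged just before) and the pod's namespace, assembled under the conventional mount path /var/run/secrets/kubernetes.io/serviceaccount. A sketch of how a container consumes it from inside the pod follows; the path is the standard Kubernetes convention, and the code is illustrative rather than cert-manager's own client bootstrap.

package main

import (
	"fmt"
	"log"
	"os"
)

func main() {
	// Standard in-cluster mount point of the projected kube-api-access volume.
	const dir = "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, name := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(dir + "/" + name)
		if err != nil {
			log.Fatalf("read %s: %v", name, err) // fails outside a pod
		}
		fmt.Printf("%s: %d bytes\n", name, len(b))
	}
}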
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" Nov 22 09:24:56 crc kubenswrapper[4846]: I1122 09:24:56.362490 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d4jkh"] Nov 22 09:24:56 crc kubenswrapper[4846]: I1122 09:24:56.385242 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:24:56 crc kubenswrapper[4846]: I1122 09:24:56.416487 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nb9wl"] Nov 22 09:24:56 crc kubenswrapper[4846]: I1122 09:24:56.421825 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-d4jkh" event={"ID":"c55358d6-9876-4e6a-9b06-08db6080a803","Type":"ContainerStarted","Data":"87e3028a0695950db4f0e669970ac7d8509510afd66340b400a80c9e57c64914"} Nov 22 09:24:56 crc kubenswrapper[4846]: W1122 09:24:56.423666 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d9cea2b_9f89_437e_a0f3_875b123a47d3.slice/crio-8d943aefc640d0cd1d52c08d0a9229636d037490091ca312731022972fffe566 WatchSource:0}: Error finding container 8d943aefc640d0cd1d52c08d0a9229636d037490091ca312731022972fffe566: Status 404 returned error can't find the container with id 8d943aefc640d0cd1d52c08d0a9229636d037490091ca312731022972fffe566 Nov 22 09:24:56 crc kubenswrapper[4846]: I1122 09:24:56.442893 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mppx8"] Nov 22 09:24:57 crc kubenswrapper[4846]: I1122 09:24:57.428489 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nb9wl" event={"ID":"1d9cea2b-9f89-437e-a0f3-875b123a47d3","Type":"ContainerStarted","Data":"8d943aefc640d0cd1d52c08d0a9229636d037490091ca312731022972fffe566"} Nov 22 09:24:57 crc kubenswrapper[4846]: I1122 09:24:57.429699 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" event={"ID":"27cc0714-ab99-4ddc-9e9c-66f24bba9fac","Type":"ContainerStarted","Data":"e80279e9e1d2934ecbe473a97382470bd32de837c9c9126c2004426c1b8f1d1e"} Nov 22 09:24:59 crc kubenswrapper[4846]: I1122 09:24:59.445260 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-d4jkh" event={"ID":"c55358d6-9876-4e6a-9b06-08db6080a803","Type":"ContainerStarted","Data":"a063b6d63f02f1fd963491cc26b0130286fabc488b546fb456f573304d9297dd"} Nov 22 09:24:59 crc kubenswrapper[4846]: I1122 09:24:59.464608 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-d4jkh" podStartSLOduration=2.551832271 podStartE2EDuration="4.464579633s" podCreationTimestamp="2025-11-22 09:24:55 +0000 UTC" firstStartedPulling="2025-11-22 09:24:56.384748469 +0000 UTC m=+671.320438118" lastFinishedPulling="2025-11-22 09:24:58.297495821 +0000 UTC m=+673.233185480" observedRunningTime="2025-11-22 09:24:59.460980718 +0000 UTC m=+674.396670387" watchObservedRunningTime="2025-11-22 09:24:59.464579633 +0000 UTC m=+674.400269282" Nov 22 09:25:00 crc kubenswrapper[4846]: I1122 09:25:00.454644 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" 
event={"ID":"27cc0714-ab99-4ddc-9e9c-66f24bba9fac","Type":"ContainerStarted","Data":"7bc8975d5f2e85c60e62b5bd5b67fe1045a6faf8f4ee3a718684c9f7e928b91b"} Nov 22 09:25:00 crc kubenswrapper[4846]: I1122 09:25:00.454807 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" Nov 22 09:25:00 crc kubenswrapper[4846]: I1122 09:25:00.457509 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nb9wl" event={"ID":"1d9cea2b-9f89-437e-a0f3-875b123a47d3","Type":"ContainerStarted","Data":"3ac864454c8bc0dda8f3b7e7c3235823e18591a5b2c9cf8b3f01915d2ffd4744"} Nov 22 09:25:00 crc kubenswrapper[4846]: I1122 09:25:00.472176 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" podStartSLOduration=1.970503465 podStartE2EDuration="5.472150914s" podCreationTimestamp="2025-11-22 09:24:55 +0000 UTC" firstStartedPulling="2025-11-22 09:24:56.451851502 +0000 UTC m=+671.387541151" lastFinishedPulling="2025-11-22 09:24:59.953498911 +0000 UTC m=+674.889188600" observedRunningTime="2025-11-22 09:25:00.471528715 +0000 UTC m=+675.407218364" watchObservedRunningTime="2025-11-22 09:25:00.472150914 +0000 UTC m=+675.407840573" Nov 22 09:25:00 crc kubenswrapper[4846]: I1122 09:25:00.490755 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-nb9wl" podStartSLOduration=1.8965214019999999 podStartE2EDuration="5.490733384s" podCreationTimestamp="2025-11-22 09:24:55 +0000 UTC" firstStartedPulling="2025-11-22 09:24:56.426791423 +0000 UTC m=+671.362481072" lastFinishedPulling="2025-11-22 09:25:00.021003405 +0000 UTC m=+674.956693054" observedRunningTime="2025-11-22 09:25:00.49057524 +0000 UTC m=+675.426264889" watchObservedRunningTime="2025-11-22 09:25:00.490733384 +0000 UTC m=+675.426423033" Nov 22 09:25:06 crc kubenswrapper[4846]: I1122 09:25:06.182835 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-mppx8" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.399280 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kws67"] Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.399720 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovn-controller" containerID="cri-o://9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6" gracePeriod=30 Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.399846 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="northd" containerID="cri-o://ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79" gracePeriod=30 Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.399940 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="sbdb" containerID="cri-o://207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a" gracePeriod=30 Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.399889 4846 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kube-rbac-proxy-node" containerID="cri-o://d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262" gracePeriod=30 Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.399984 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="nbdb" containerID="cri-o://81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89" gracePeriod=30 Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.399988 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4" gracePeriod=30 Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.400377 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovn-acl-logging" containerID="cri-o://fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8" gracePeriod=30 Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.515929 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" containerID="cri-o://4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575" gracePeriod=30 Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.855315 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/3.log" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.858519 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovn-acl-logging/0.log" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.859098 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovn-controller/0.log" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.859600 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.930824 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-var-lib-openvswitch\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.930885 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-log-socket\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.930929 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-env-overrides\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.930946 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-config\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.930963 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-script-lib\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.930954 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931009 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-node-log\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931040 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-ovn\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931079 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-ovn-kubernetes\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931107 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-netd\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931136 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scw9m\" (UniqueName: \"kubernetes.io/projected/c874da16-5eda-477e-bbd5-e5c105dc7a07-kube-api-access-scw9m\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931158 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-openvswitch\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931177 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-kubelet\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931215 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-netns\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931245 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-systemd-units\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931268 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931290 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovn-node-metrics-cert\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931318 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-slash\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931336 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-systemd\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931352 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-etc-openvswitch\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931367 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-bin\") pod \"c874da16-5eda-477e-bbd5-e5c105dc7a07\" (UID: \"c874da16-5eda-477e-bbd5-e5c105dc7a07\") " Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931613 4846 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931646 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931680 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931710 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931723 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931726 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931729 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931759 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-node-log" (OuterVolumeSpecName: "node-log") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931751 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931768 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931769 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931828 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). 
InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931864 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931902 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931933 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-slash" (OuterVolumeSpecName: "host-slash") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.931965 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.934588 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-log-socket" (OuterVolumeSpecName: "log-socket") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.939014 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.940827 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c874da16-5eda-477e-bbd5-e5c105dc7a07-kube-api-access-scw9m" (OuterVolumeSpecName: "kube-api-access-scw9m") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "kube-api-access-scw9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.943346 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qr7sx"] Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.943662 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.943685 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.943699 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kubecfg-setup" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.943909 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kubecfg-setup" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.943923 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovn-acl-logging" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.943932 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovn-acl-logging" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.943946 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovn-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.943954 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovn-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.943967 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="nbdb" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.943977 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="nbdb" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.943994 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="sbdb" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944002 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="sbdb" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.944010 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kube-rbac-proxy-node" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944018 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kube-rbac-proxy-node" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.944029 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944038 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.944067 4846 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944075 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.944085 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="northd" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944092 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="northd" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.944102 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944112 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944234 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944247 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="nbdb" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944260 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944269 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="sbdb" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944283 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="northd" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944292 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944299 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="kube-rbac-proxy-node" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944308 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovn-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944322 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovn-acl-logging" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944331 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.944481 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944490 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: E1122 09:25:07.944505 4846 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.944514 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.947276 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.947331 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerName="ovnkube-controller" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.952466 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:07 crc kubenswrapper[4846]: I1122 09:25:07.970250 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c874da16-5eda-477e-bbd5-e5c105dc7a07" (UID: "c874da16-5eda-477e-bbd5-e5c105dc7a07"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033158 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-cni-netd\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033285 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033324 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07a2dc04-672f-4599-80ad-c5499e84d1ef-ovn-node-metrics-cert\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033361 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlnxr\" (UniqueName: \"kubernetes.io/projected/07a2dc04-672f-4599-80ad-c5499e84d1ef-kube-api-access-dlnxr\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033408 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-slash\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033555 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07a2dc04-672f-4599-80ad-c5499e84d1ef-env-overrides\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033631 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-run-openvswitch\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033667 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-kubelet\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033740 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-run-netns\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033818 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-log-socket\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.033906 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07a2dc04-672f-4599-80ad-c5499e84d1ef-ovnkube-script-lib\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034149 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-node-log\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034191 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-systemd-units\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034213 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-run-systemd\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: 
I1122 09:25:08.034239 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07a2dc04-672f-4599-80ad-c5499e84d1ef-ovnkube-config\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034294 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-cni-bin\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034320 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-var-lib-openvswitch\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034343 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-run-ovn\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034391 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-etc-openvswitch\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034426 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034503 4846 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-log-socket\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034521 4846 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034535 4846 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034547 4846 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c874da16-5eda-477e-bbd5-e5c105dc7a07-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034558 4846 reconciler_common.go:293] 
"Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-node-log\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034571 4846 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034587 4846 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034600 4846 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034613 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scw9m\" (UniqueName: \"kubernetes.io/projected/c874da16-5eda-477e-bbd5-e5c105dc7a07-kube-api-access-scw9m\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034629 4846 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034643 4846 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034658 4846 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034672 4846 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.034845 4846 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.035110 4846 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c874da16-5eda-477e-bbd5-e5c105dc7a07-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.035255 4846 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-slash\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.035277 4846 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.035293 4846 reconciler_common.go:293] "Volume detached for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.035306 4846 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c874da16-5eda-477e-bbd5-e5c105dc7a07-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136677 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-etc-openvswitch\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136728 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136774 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-cni-netd\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136797 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136833 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07a2dc04-672f-4599-80ad-c5499e84d1ef-ovn-node-metrics-cert\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136876 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlnxr\" (UniqueName: \"kubernetes.io/projected/07a2dc04-672f-4599-80ad-c5499e84d1ef-kube-api-access-dlnxr\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136921 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-slash\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136942 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07a2dc04-672f-4599-80ad-c5499e84d1ef-env-overrides\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136962 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-run-openvswitch\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136931 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-etc-openvswitch\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137024 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-kubelet\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.136977 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-kubelet\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137089 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137120 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-cni-netd\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137119 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-run-netns\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137201 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-log-socket\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137418 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07a2dc04-672f-4599-80ad-c5499e84d1ef-ovnkube-script-lib\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137419 4846 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-slash\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137462 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-node-log\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137560 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-systemd-units\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137592 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-run-systemd\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137621 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07a2dc04-672f-4599-80ad-c5499e84d1ef-ovnkube-config\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137665 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-cni-bin\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137695 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-var-lib-openvswitch\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137719 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-run-ovn\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137827 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-run-openvswitch\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137875 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-cni-bin\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137849 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-run-ovn\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137917 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-var-lib-openvswitch\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137942 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-systemd-units\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137148 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137976 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-log-socket\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137987 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/07a2dc04-672f-4599-80ad-c5499e84d1ef-env-overrides\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.138005 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-host-run-netns\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.137996 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-run-systemd\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.138103 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/07a2dc04-672f-4599-80ad-c5499e84d1ef-node-log\") pod \"ovnkube-node-qr7sx\" (UID: 
\"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.138417 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/07a2dc04-672f-4599-80ad-c5499e84d1ef-ovnkube-config\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.138581 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/07a2dc04-672f-4599-80ad-c5499e84d1ef-ovnkube-script-lib\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.141487 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/07a2dc04-672f-4599-80ad-c5499e84d1ef-ovn-node-metrics-cert\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.160295 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlnxr\" (UniqueName: \"kubernetes.io/projected/07a2dc04-672f-4599-80ad-c5499e84d1ef-kube-api-access-dlnxr\") pod \"ovnkube-node-qr7sx\" (UID: \"07a2dc04-672f-4599-80ad-c5499e84d1ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.270176 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" Nov 22 09:25:08 crc kubenswrapper[4846]: W1122 09:25:08.292866 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a2dc04_672f_4599_80ad_c5499e84d1ef.slice/crio-d0f8f9931a3cd9d750d345b17a40083a22d231628c902a62d822fbb25ca3d76f WatchSource:0}: Error finding container d0f8f9931a3cd9d750d345b17a40083a22d231628c902a62d822fbb25ca3d76f: Status 404 returned error can't find the container with id d0f8f9931a3cd9d750d345b17a40083a22d231628c902a62d822fbb25ca3d76f Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.507707 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovnkube-controller/3.log" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.511113 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovn-acl-logging/0.log" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.511603 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kws67_c874da16-5eda-477e-bbd5-e5c105dc7a07/ovn-controller/0.log" Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.511941 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575" exitCode=0 Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.511967 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a" 
exitCode=0 Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.511975 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89" exitCode=0 Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.511984 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79" exitCode=0 Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.511992 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4" exitCode=0 Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.511998 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262" exitCode=0 Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512010 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8" exitCode=143 Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512016 4846 generic.go:334] "Generic (PLEG): container finished" podID="c874da16-5eda-477e-bbd5-e5c105dc7a07" containerID="9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6" exitCode=143 Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512037 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"} Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512158 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kws67"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512163 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512195 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512211 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512224 4846 scope.go:117] "RemoveContainer" containerID="4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512230 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512255 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512271 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512287 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512296 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512304 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512312 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512323 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512330 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512337 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512345 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512355 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512366 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512376 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512382 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512399 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512406 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512414 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512421 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512432 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512439 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512447 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512458 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512470 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512477 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512484 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512492 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512498 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512507 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512515 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512521 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512528 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512535 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512543 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kws67" event={"ID":"c874da16-5eda-477e-bbd5-e5c105dc7a07","Type":"ContainerDied","Data":"62735a5b6b9a38a4c4d8acd7e31b376cf49305316f3d97d331d51174828f3cf6"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512560 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512570 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512578 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512585 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512592 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512599 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512606 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512613 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512620 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.512626 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.517622 4846 generic.go:334] "Generic (PLEG): container finished" podID="07a2dc04-672f-4599-80ad-c5499e84d1ef" containerID="24be4b16b1445048058fceaca0514de8a3fc498cfb0ca553801290234bc65e63" exitCode=0
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.517708 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerDied","Data":"24be4b16b1445048058fceaca0514de8a3fc498cfb0ca553801290234bc65e63"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.517775 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerStarted","Data":"d0f8f9931a3cd9d750d345b17a40083a22d231628c902a62d822fbb25ca3d76f"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.520098 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/2.log"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.520912 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/1.log"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.520958 4846 generic.go:334] "Generic (PLEG): container finished" podID="9aec6a38-e6e4-4009-95d2-6a179c7fac04" containerID="2c9ecafae6b69b17dbedcb7f5d9e0c34ac261a6452f0276112bc86f9662471e7" exitCode=2
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.520987 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbcs8" event={"ID":"9aec6a38-e6e4-4009-95d2-6a179c7fac04","Type":"ContainerDied","Data":"2c9ecafae6b69b17dbedcb7f5d9e0c34ac261a6452f0276112bc86f9662471e7"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.521012 4846 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a"}
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.521425 4846 scope.go:117] "RemoveContainer" containerID="2c9ecafae6b69b17dbedcb7f5d9e0c34ac261a6452f0276112bc86f9662471e7"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.521591 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-hbcs8_openshift-multus(9aec6a38-e6e4-4009-95d2-6a179c7fac04)\"" pod="openshift-multus/multus-hbcs8" podUID="9aec6a38-e6e4-4009-95d2-6a179c7fac04"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.539783 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kws67"]
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.544689 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kws67"]
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.574825 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.608425 4846 scope.go:117] "RemoveContainer" containerID="207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.626678 4846 scope.go:117] "RemoveContainer" containerID="81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.641845 4846 scope.go:117] "RemoveContainer" containerID="ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.658304 4846 scope.go:117] "RemoveContainer" containerID="a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.672295 4846 scope.go:117] "RemoveContainer" containerID="d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.689656 4846 scope.go:117] "RemoveContainer" containerID="fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.708784 4846 scope.go:117] "RemoveContainer" containerID="9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.740624 4846 scope.go:117] "RemoveContainer" containerID="4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.764860 4846 scope.go:117] "RemoveContainer" containerID="4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.768121 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": container with ID starting with 4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575 not found: ID does not exist" containerID="4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.768197 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"} err="failed to get container status \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": rpc error: code = NotFound desc = could not find container \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": container with ID starting with 4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.768274 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.768997 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\": container with ID starting with e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f not found: ID does not exist" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.769022 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"} err="failed to get container status \"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\": rpc error: code = NotFound desc = could not find container \"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\": container with ID starting with e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.769038 4846 scope.go:117] "RemoveContainer" containerID="207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.769721 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\": container with ID starting with 207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a not found: ID does not exist" containerID="207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.769787 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"} err="failed to get container status \"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\": rpc error: code = NotFound desc = could not find container \"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\": container with ID starting with 207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.769834 4846 scope.go:117] "RemoveContainer" containerID="81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.770598 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\": container with ID starting with 81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89 not found: ID does not exist" containerID="81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.770658 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"} err="failed to get container status \"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\": rpc error: code = NotFound desc = could not find container \"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\": container with ID starting with 81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.770697 4846 scope.go:117] "RemoveContainer" containerID="ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.771106 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\": container with ID starting with ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79 not found: ID does not exist" containerID="ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.771136 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"} err="failed to get container status \"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\": rpc error: code = NotFound desc = could not find container \"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\": container with ID starting with ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.771156 4846 scope.go:117] "RemoveContainer" containerID="a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.771486 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\": container with ID starting with a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4 not found: ID does not exist" containerID="a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.771531 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"} err="failed to get container status \"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\": rpc error: code = NotFound desc = could not find container \"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\": container with ID starting with a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.771553 4846 scope.go:117] "RemoveContainer" containerID="d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.771887 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\": container with ID starting with d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262 not found: ID does not exist" containerID="d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.771913 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"} err="failed to get container status \"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\": rpc error: code = NotFound desc = could not find container \"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\": container with ID starting with d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.771933 4846 scope.go:117] "RemoveContainer" containerID="fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.772311 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\": container with ID starting with fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8 not found: ID does not exist" containerID="fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.772354 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"} err="failed to get container status \"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\": rpc error: code = NotFound desc = could not find container \"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\": container with ID starting with fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.772379 4846 scope.go:117] "RemoveContainer" containerID="9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.772976 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\": container with ID starting with 9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6 not found: ID does not exist" containerID="9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.773007 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"} err="failed to get container status \"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\": rpc error: code = NotFound desc = could not find container \"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\": container with ID starting with 9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.773023 4846 scope.go:117] "RemoveContainer" containerID="4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"
Nov 22 09:25:08 crc kubenswrapper[4846]: E1122 09:25:08.773433 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\": container with ID starting with 4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49 not found: ID does not exist" containerID="4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.773463 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"} err="failed to get container status \"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\": rpc error: code = NotFound desc = could not find container \"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\": container with ID starting with 4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.773482 4846 scope.go:117] "RemoveContainer" containerID="4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.774278 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"} err="failed to get container status \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": rpc error: code = NotFound desc = could not find container \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": container with ID starting with 4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.774328 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.774791 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"} err="failed to get container status \"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\": rpc error: code = NotFound desc = could not find container \"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\": container with ID starting with e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.774883 4846 scope.go:117] "RemoveContainer" containerID="207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.775254 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"} err="failed to get container status \"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\": rpc error: code = NotFound desc = could not find container \"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\": container with ID starting with 207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.775280 4846 scope.go:117] "RemoveContainer" containerID="81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.775556 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"} err="failed to get container status \"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\": rpc error: code = NotFound desc = could not find container \"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\": container with ID starting with 81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.775695 4846 scope.go:117] "RemoveContainer" containerID="ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.776128 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"} err="failed to get container status \"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\": rpc error: code = NotFound desc = could not find container \"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\": container with ID starting with ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.776152 4846 scope.go:117] "RemoveContainer" containerID="a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.776729 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"} err="failed to get container status \"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\": rpc error: code = NotFound desc = could not find container \"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\": container with ID starting with a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.776791 4846 scope.go:117] "RemoveContainer" containerID="d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.777144 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"} err="failed to get container status \"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\": rpc error: code = NotFound desc = could not find container \"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\": container with ID starting with d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.777176 4846 scope.go:117] "RemoveContainer" containerID="fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.777498 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"} err="failed to get container status \"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\": rpc error: code = NotFound desc = could not find container \"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\": container with ID starting with fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.777521 4846 scope.go:117] "RemoveContainer" containerID="9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.777797 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"} err="failed to get container status \"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\": rpc error: code = NotFound desc = could not find container \"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\": container with ID starting with 9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.777824 4846 scope.go:117] "RemoveContainer" containerID="4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.778425 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"} err="failed to get container status \"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\": rpc error: code = NotFound desc = could not find container \"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\": container with ID starting with 4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.778447 4846 scope.go:117] "RemoveContainer" containerID="4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.778796 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"} err="failed to get container status \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": rpc error: code = NotFound desc = could not find container \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": container with ID starting with 4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.778813 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.779166 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"} err="failed to get container status \"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\": rpc error: code = NotFound desc = could not find container \"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\": container with ID starting with e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.779185 4846 scope.go:117] "RemoveContainer" containerID="207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.779528 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"} err="failed to get container status \"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\": rpc error: code = NotFound desc = could not find container \"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\": container with ID starting with 207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.779546 4846 scope.go:117] "RemoveContainer" containerID="81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.779825 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"} err="failed to get container status \"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\": rpc error: code = NotFound desc = could not find container \"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\": container with ID starting with 81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.779846 4846 scope.go:117] "RemoveContainer" containerID="ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.780181 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"} err="failed to get container status \"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\": rpc error: code = NotFound desc = could not find container \"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\": container with ID starting with ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.780209 4846 scope.go:117] "RemoveContainer" containerID="a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.780548 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"} err="failed to get container status \"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\": rpc error: code = NotFound desc = could not find container \"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\": container with ID starting with a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.780573 4846 scope.go:117] "RemoveContainer" containerID="d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.781330 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"} err="failed to get container status \"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\": rpc error: code = NotFound desc = could not find container \"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\": container with ID starting with d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.781351 4846 scope.go:117] "RemoveContainer" containerID="fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.781668 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"} err="failed to get container status \"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\": rpc error: code = NotFound desc = could not find container \"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\": container with ID starting with fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.781688 4846 scope.go:117] "RemoveContainer" containerID="9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.782005 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"} err="failed to get container status \"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\": rpc error: code = NotFound desc = could not find container \"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\": container with ID starting with 9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.782034 4846 scope.go:117] "RemoveContainer" containerID="4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.782429 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"} err="failed to get container status \"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\": rpc error: code = NotFound desc = could not find container \"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\": container with ID starting with 4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.782453 4846 scope.go:117] "RemoveContainer" containerID="4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.782800 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"} err="failed to get container status \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": rpc error: code = NotFound desc = could not find container \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": container with ID starting with 4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.782898 4846 scope.go:117] "RemoveContainer" containerID="e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.783317 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f"} err="failed to get container status \"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\": rpc error: code = NotFound desc = could not find container \"e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f\": container with ID starting with e834f9df7a8858c2ecf5868d975bc8239a57014d20122c7b5f46e076b9d2b81f not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.783340 4846 scope.go:117] "RemoveContainer" containerID="207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.783624 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a"} err="failed to get container status \"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\": rpc error: code = NotFound desc = could not find container \"207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a\": container with ID starting with 207d6900a9d7b530642bebdc49fbd9cd6237b5b2fb8d7480ca6801df10f9b83a not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.783728 4846 scope.go:117] "RemoveContainer" containerID="81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.784227 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89"} err="failed to get container status \"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\": rpc error: code = NotFound desc = could not find container \"81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89\": container with ID starting with 81a88b46b28af0e024ca121b8066858853c0d7341111e8005ab55993aea78a89 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.784276 4846 scope.go:117] "RemoveContainer" containerID="ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.784602 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79"} err="failed to get container status \"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\": rpc error: code = NotFound desc = could not find container \"ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79\": container with ID starting with ba62205fdb0cbac75e59aeccab2c82b22d6361e4ffcc368df7472abca7656a79 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.784701 4846 scope.go:117] "RemoveContainer" containerID="a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.785094 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4"} err="failed to get container status \"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\": rpc error: code = NotFound desc = could not find container \"a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4\": container with ID starting with a2d5bc5193ed389a54e44e7a70f444b18cfc7327fc134bf4d0b4ad488fe992f4 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.785127 4846 scope.go:117] "RemoveContainer" containerID="d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.785494 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262"} err="failed to get container status \"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\": rpc error: code = NotFound desc = could not find container \"d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262\": container with ID starting with d737e21527309f94ca99647ee8f7604151275345560030428e07a9de1d0e6262 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.785518 4846 scope.go:117] "RemoveContainer" containerID="fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.786129 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8"} err="failed to get container status \"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\": rpc error: code = NotFound desc = could not find container \"fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8\": container with ID starting with fefc657022dca353d2baa2b7affb644da843052ac5fbadb861b28124b41e8ab8 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.786250 4846 scope.go:117] "RemoveContainer" containerID="9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.786537 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6"} err="failed to get container status \"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\": rpc error: code = NotFound desc = could not find container \"9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6\": container with ID starting with 9fcd1b48b3dbb848bee14dcf5b99edb978660df51246f0815641acd134f559a6 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.786565 4846 scope.go:117] "RemoveContainer" containerID="4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.786812 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49"} err="failed to get container status \"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\": rpc error: code = NotFound desc = could not find container \"4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49\": container with ID starting with 4f958e0e54928dff67ad7a2b713b19e972e4a1dd381a9c5650d43fb233004a49 not found: ID does not exist"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.786842 4846 scope.go:117] "RemoveContainer" containerID="4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"
Nov 22 09:25:08 crc kubenswrapper[4846]: I1122 09:25:08.787176 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575"} err="failed to get container status \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": rpc error: code = NotFound desc = could not find container \"4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575\": container with ID starting with 4048b87ca9ec049f1b61a99db2b5445571a7fd8708280e5d20c2b29ecf6a2575 not found: ID does not exist"
Nov 22 09:25:09 crc kubenswrapper[4846]: I1122 09:25:09.535673 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerStarted","Data":"ead97e05b35542fa9b375dabe90a1daeb0385d7b2c1c9eebbfd61daa50f64c05"}
Nov 22 09:25:09 crc kubenswrapper[4846]: I1122 09:25:09.536069 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerStarted","Data":"ca74af209f1f11781dd9005051a9c2c37c6a50298ca6c8dc2e788f9b92061e73"}
Nov 22 09:25:09 crc kubenswrapper[4846]: I1122 09:25:09.536090 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerStarted","Data":"d3591bddaa63d6d0536ce93e3d873de2495b8a746cb9cd3a6ef203554a879b40"}
Nov 22 09:25:09 crc kubenswrapper[4846]: I1122 09:25:09.536110 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerStarted","Data":"7b9d9a5ed2105c6568f738ae05e577c6d6ac27f76d35a2f76e8436c8255d45ff"}
Nov 22 09:25:09 crc kubenswrapper[4846]: I1122 09:25:09.536123 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerStarted","Data":"615b022478cfa399a7096bed66eecebdac65adf378807923fcf9ef0c5839d7c6"}
Nov 22 09:25:09 crc kubenswrapper[4846]: I1122 09:25:09.536137 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerStarted","Data":"24d89d48deb4be581063efba0dd16d7cb7180950234cbc0f7735e9ab8b18bce2"}
Nov 22 09:25:10 crc kubenswrapper[4846]: I1122 09:25:10.043259 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c874da16-5eda-477e-bbd5-e5c105dc7a07" path="/var/lib/kubelet/pods/c874da16-5eda-477e-bbd5-e5c105dc7a07/volumes"
Nov 22 09:25:11 crc kubenswrapper[4846]: I1122 09:25:11.557665 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerStarted","Data":"d97fb6f49b7674d67cb0ff05cb172ca49907c3102fd3f0c8acd487a7497cb005"}
Nov 22 09:25:14 crc kubenswrapper[4846]: I1122 09:25:14.582245 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" event={"ID":"07a2dc04-672f-4599-80ad-c5499e84d1ef","Type":"ContainerStarted","Data":"97134d54bfe9ff2e2e40ecca18d0b05156aa713633e4cb6d4a9959b5d368fa1a"}
Nov 22 09:25:14 crc kubenswrapper[4846]: I1122 09:25:14.583117 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx"
Nov 22 09:25:14 crc kubenswrapper[4846]: I1122 09:25:14.612644 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx" podStartSLOduration=7.612624224 podStartE2EDuration="7.612624224s" podCreationTimestamp="2025-11-22 09:25:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:25:14.610568125 +0000 UTC m=+689.546257784" watchObservedRunningTime="2025-11-22 09:25:14.612624224 +0000 UTC m=+689.548313873"
Nov 22 09:25:14 crc kubenswrapper[4846]: I1122 09:25:14.618627 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx"
Nov 22 09:25:15 crc kubenswrapper[4846]: I1122 09:25:15.591153 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx"
Nov 22 09:25:15 crc kubenswrapper[4846]: I1122 09:25:15.591319 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx"
Nov 22 09:25:15 crc kubenswrapper[4846]: I1122 09:25:15.626772 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx"
Nov 22 09:25:23 crc kubenswrapper[4846]: I1122 09:25:23.035685 4846 scope.go:117] "RemoveContainer" containerID="2c9ecafae6b69b17dbedcb7f5d9e0c34ac261a6452f0276112bc86f9662471e7"
Nov 22 09:25:23 crc kubenswrapper[4846]: E1122 09:25:23.037183 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-hbcs8_openshift-multus(9aec6a38-e6e4-4009-95d2-6a179c7fac04)\"" pod="openshift-multus/multus-hbcs8" podUID="9aec6a38-e6e4-4009-95d2-6a179c7fac04"
Nov 22 09:25:37 crc kubenswrapper[4846]: I1122 09:25:37.036517 4846 scope.go:117] "RemoveContainer" containerID="2c9ecafae6b69b17dbedcb7f5d9e0c34ac261a6452f0276112bc86f9662471e7"
Nov 22 09:25:37 crc kubenswrapper[4846]: I1122 09:25:37.744589 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/2.log"
Nov 22 09:25:37 crc kubenswrapper[4846]: I1122 09:25:37.745621 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/1.log"
Nov 22 09:25:37 crc kubenswrapper[4846]: I1122 09:25:37.745691 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hbcs8" event={"ID":"9aec6a38-e6e4-4009-95d2-6a179c7fac04","Type":"ContainerStarted","Data":"7e43254bd97224b5d4d1b072e7a6e3f237299f6ba4ab1f810437fcc6826e8539"}
Nov 22 09:25:38 crc kubenswrapper[4846]: I1122 09:25:38.302116 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qr7sx"
Nov 22 09:25:46 crc kubenswrapper[4846]: I1122 09:25:46.317767 4846 scope.go:117] "RemoveContainer" containerID="8eef37c830e1bf75a25f4cc1337fd06691ab0a86da7ecef260ce53180608034a"
Nov 22 09:25:46 crc kubenswrapper[4846]: I1122 09:25:46.811163 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hbcs8_9aec6a38-e6e4-4009-95d2-6a179c7fac04/kube-multus/2.log"
Nov 22 09:25:47 crc kubenswrapper[4846]: I1122 09:25:47.737278 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"]
Nov 22 09:25:47 crc kubenswrapper[4846]: I1122 09:25:47.740477 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:47 crc kubenswrapper[4846]: I1122 09:25:47.748630 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Nov 22 09:25:47 crc kubenswrapper[4846]: I1122 09:25:47.764947 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"]
Nov 22 09:25:47 crc kubenswrapper[4846]: I1122 09:25:47.915397 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:47 crc kubenswrapper[4846]: I1122 09:25:47.915510 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:47 crc kubenswrapper[4846]: I1122 09:25:47.915544 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmjr8\" (UniqueName: \"kubernetes.io/projected/f112ec7f-7ff7-4205-a2c9-331d34530c5a-kube-api-access-wmjr8\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.017247 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.017596 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmjr8\" (UniqueName: \"kubernetes.io/projected/f112ec7f-7ff7-4205-a2c9-331d34530c5a-kube-api-access-wmjr8\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.017840 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.018097 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.018528 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.044239 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmjr8\" (UniqueName: \"kubernetes.io/projected/f112ec7f-7ff7-4205-a2c9-331d34530c5a-kube-api-access-wmjr8\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.064727 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.277908 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"]
Nov 22 09:25:48 crc kubenswrapper[4846]: W1122 09:25:48.285899 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf112ec7f_7ff7_4205_a2c9_331d34530c5a.slice/crio-a6fda2c9647409db177060d448870869c663dbf1a2f763657422540d85efc895 WatchSource:0}: Error finding container a6fda2c9647409db177060d448870869c663dbf1a2f763657422540d85efc895: Status 404 returned error can't find the container with id a6fda2c9647409db177060d448870869c663dbf1a2f763657422540d85efc895
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.825876 4846 generic.go:334] "Generic (PLEG): container finished" podID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerID="a193cb21f966ac5483ad228208607bc2c336b11918762e9441289016983ecfc3" exitCode=0
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.825934 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx" event={"ID":"f112ec7f-7ff7-4205-a2c9-331d34530c5a","Type":"ContainerDied","Data":"a193cb21f966ac5483ad228208607bc2c336b11918762e9441289016983ecfc3"}
Nov 22 09:25:48 crc kubenswrapper[4846]: I1122 09:25:48.825973 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx" event={"ID":"f112ec7f-7ff7-4205-a2c9-331d34530c5a","Type":"ContainerStarted","Data":"a6fda2c9647409db177060d448870869c663dbf1a2f763657422540d85efc895"}
Nov 22 09:25:50 crc kubenswrapper[4846]: I1122 09:25:50.837597 4846 generic.go:334] "Generic (PLEG): container finished" podID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerID="c613fca9ba6ba19b6c813e3b77f540fbba5c64f25ad1a14d60c2421c83ffc324" exitCode=0
Nov 22 09:25:50 crc kubenswrapper[4846]: I1122 09:25:50.837765 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx" event={"ID":"f112ec7f-7ff7-4205-a2c9-331d34530c5a","Type":"ContainerDied","Data":"c613fca9ba6ba19b6c813e3b77f540fbba5c64f25ad1a14d60c2421c83ffc324"}
Nov 22 09:25:51 crc kubenswrapper[4846]: I1122 09:25:51.849547 4846 generic.go:334] "Generic (PLEG): container finished" podID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerID="537785b856c473a46da6a36bc189eb592a7d834c21ed22c3fa1adbadbd787295" exitCode=0
Nov 22 09:25:51 crc kubenswrapper[4846]: I1122 09:25:51.849676 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx" event={"ID":"f112ec7f-7ff7-4205-a2c9-331d34530c5a","Type":"ContainerDied","Data":"537785b856c473a46da6a36bc189eb592a7d834c21ed22c3fa1adbadbd787295"}
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.130423 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx"
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.296158 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-util\") pod \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") "
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.296234 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmjr8\" (UniqueName: \"kubernetes.io/projected/f112ec7f-7ff7-4205-a2c9-331d34530c5a-kube-api-access-wmjr8\") pod \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") "
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.296365 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-bundle\") pod \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\" (UID: \"f112ec7f-7ff7-4205-a2c9-331d34530c5a\") "
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.297839 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-bundle" (OuterVolumeSpecName: "bundle") pod "f112ec7f-7ff7-4205-a2c9-331d34530c5a" (UID: "f112ec7f-7ff7-4205-a2c9-331d34530c5a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.309342 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f112ec7f-7ff7-4205-a2c9-331d34530c5a-kube-api-access-wmjr8" (OuterVolumeSpecName: "kube-api-access-wmjr8") pod "f112ec7f-7ff7-4205-a2c9-331d34530c5a" (UID: "f112ec7f-7ff7-4205-a2c9-331d34530c5a"). InnerVolumeSpecName "kube-api-access-wmjr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.315663 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-util" (OuterVolumeSpecName: "util") pod "f112ec7f-7ff7-4205-a2c9-331d34530c5a" (UID: "f112ec7f-7ff7-4205-a2c9-331d34530c5a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.397993 4846 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.398096 4846 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f112ec7f-7ff7-4205-a2c9-331d34530c5a-util\") on node \"crc\" DevicePath \"\""
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.398117 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmjr8\" (UniqueName: \"kubernetes.io/projected/f112ec7f-7ff7-4205-a2c9-331d34530c5a-kube-api-access-wmjr8\") on node \"crc\" DevicePath \"\""
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.874097 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx" event={"ID":"f112ec7f-7ff7-4205-a2c9-331d34530c5a","Type":"ContainerDied","Data":"a6fda2c9647409db177060d448870869c663dbf1a2f763657422540d85efc895"}
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.874166 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6fda2c9647409db177060d448870869c663dbf1a2f763657422540d85efc895"
Nov 22 09:25:53 crc kubenswrapper[4846]: I1122 09:25:53.874226 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.390895 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-629pv"] Nov 22 09:25:56 crc kubenswrapper[4846]: E1122 09:25:56.392377 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerName="pull" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.392432 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerName="pull" Nov 22 09:25:56 crc kubenswrapper[4846]: E1122 09:25:56.392457 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerName="util" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.392465 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerName="util" Nov 22 09:25:56 crc kubenswrapper[4846]: E1122 09:25:56.392481 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerName="extract" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.392488 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerName="extract" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.392865 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="f112ec7f-7ff7-4205-a2c9-331d34530c5a" containerName="extract" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.393647 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-629pv" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.402034 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-k848h" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.402217 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.402438 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.415062 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-629pv"] Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.544874 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c2kx\" (UniqueName: \"kubernetes.io/projected/1fecb21a-594d-4e4f-a063-37cbf0e0d5ea-kube-api-access-2c2kx\") pod \"nmstate-operator-557fdffb88-629pv\" (UID: \"1fecb21a-594d-4e4f-a063-37cbf0e0d5ea\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-629pv" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.646885 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c2kx\" (UniqueName: \"kubernetes.io/projected/1fecb21a-594d-4e4f-a063-37cbf0e0d5ea-kube-api-access-2c2kx\") pod \"nmstate-operator-557fdffb88-629pv\" (UID: \"1fecb21a-594d-4e4f-a063-37cbf0e0d5ea\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-629pv" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.666273 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c2kx\" 
(UniqueName: \"kubernetes.io/projected/1fecb21a-594d-4e4f-a063-37cbf0e0d5ea-kube-api-access-2c2kx\") pod \"nmstate-operator-557fdffb88-629pv\" (UID: \"1fecb21a-594d-4e4f-a063-37cbf0e0d5ea\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-629pv" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.722278 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-629pv" Nov 22 09:25:56 crc kubenswrapper[4846]: I1122 09:25:56.917290 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-629pv"] Nov 22 09:25:57 crc kubenswrapper[4846]: I1122 09:25:57.901022 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-629pv" event={"ID":"1fecb21a-594d-4e4f-a063-37cbf0e0d5ea","Type":"ContainerStarted","Data":"7fe349b9a75142b16f2ffc75cb02ffddffcb70afb1410999bba3a1d23e0fa213"} Nov 22 09:25:58 crc kubenswrapper[4846]: I1122 09:25:58.625417 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:25:58 crc kubenswrapper[4846]: I1122 09:25:58.626103 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:25:59 crc kubenswrapper[4846]: I1122 09:25:59.913483 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-629pv" event={"ID":"1fecb21a-594d-4e4f-a063-37cbf0e0d5ea","Type":"ContainerStarted","Data":"99957671e39d3faebf51fe5b2dfe7f5fffe0878b7e98919ba63dc514b4bfdb09"} Nov 22 09:25:59 crc kubenswrapper[4846]: I1122 09:25:59.936430 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-629pv" podStartSLOduration=1.485429143 podStartE2EDuration="3.936404523s" podCreationTimestamp="2025-11-22 09:25:56 +0000 UTC" firstStartedPulling="2025-11-22 09:25:56.93474674 +0000 UTC m=+731.870436389" lastFinishedPulling="2025-11-22 09:25:59.38572212 +0000 UTC m=+734.321411769" observedRunningTime="2025-11-22 09:25:59.934791966 +0000 UTC m=+734.870481625" watchObservedRunningTime="2025-11-22 09:25:59.936404523 +0000 UTC m=+734.872094172" Nov 22 09:26:00 crc kubenswrapper[4846]: I1122 09:26:00.993269 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk"] Nov 22 09:26:00 crc kubenswrapper[4846]: I1122 09:26:00.995080 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk" Nov 22 09:26:00 crc kubenswrapper[4846]: I1122 09:26:00.997464 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-crj68" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.015666 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk"] Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.023374 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz"] Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.024319 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.029531 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.035246 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-lpsrm"] Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.036067 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.053476 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz"] Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.108219 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c6f850af-f692-4fa4-b289-1fd426f79090-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-55lsz\" (UID: \"c6f850af-f692-4fa4-b289-1fd426f79090\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.108312 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plvkw\" (UniqueName: \"kubernetes.io/projected/c6f850af-f692-4fa4-b289-1fd426f79090-kube-api-access-plvkw\") pod \"nmstate-webhook-6b89b748d8-55lsz\" (UID: \"c6f850af-f692-4fa4-b289-1fd426f79090\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.108371 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcwg4\" (UniqueName: \"kubernetes.io/projected/9173cda0-1bab-4e52-96e3-4e3c564b846f-kube-api-access-bcwg4\") pod \"nmstate-metrics-5dcf9c57c5-bkbgk\" (UID: \"9173cda0-1bab-4e52-96e3-4e3c564b846f\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.156117 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx"] Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.156911 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.158921 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-g8c48" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.159134 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.159197 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.172287 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx"] Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.209581 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dd1e7111-d57d-44c4-bcdb-7045dc626f01-nmstate-lock\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.209702 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqrf\" (UniqueName: \"kubernetes.io/projected/dd1e7111-d57d-44c4-bcdb-7045dc626f01-kube-api-access-4wqrf\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.209772 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dd1e7111-d57d-44c4-bcdb-7045dc626f01-dbus-socket\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.209856 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c6f850af-f692-4fa4-b289-1fd426f79090-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-55lsz\" (UID: \"c6f850af-f692-4fa4-b289-1fd426f79090\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.209923 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dd1e7111-d57d-44c4-bcdb-7045dc626f01-ovs-socket\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: E1122 09:26:01.209990 4846 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Nov 22 09:26:01 crc kubenswrapper[4846]: E1122 09:26:01.210094 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6f850af-f692-4fa4-b289-1fd426f79090-tls-key-pair podName:c6f850af-f692-4fa4-b289-1fd426f79090 nodeName:}" failed. No retries permitted until 2025-11-22 09:26:01.71007418 +0000 UTC m=+736.645763829 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/c6f850af-f692-4fa4-b289-1fd426f79090-tls-key-pair") pod "nmstate-webhook-6b89b748d8-55lsz" (UID: "c6f850af-f692-4fa4-b289-1fd426f79090") : secret "openshift-nmstate-webhook" not found Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.210349 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plvkw\" (UniqueName: \"kubernetes.io/projected/c6f850af-f692-4fa4-b289-1fd426f79090-kube-api-access-plvkw\") pod \"nmstate-webhook-6b89b748d8-55lsz\" (UID: \"c6f850af-f692-4fa4-b289-1fd426f79090\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.212613 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcwg4\" (UniqueName: \"kubernetes.io/projected/9173cda0-1bab-4e52-96e3-4e3c564b846f-kube-api-access-bcwg4\") pod \"nmstate-metrics-5dcf9c57c5-bkbgk\" (UID: \"9173cda0-1bab-4e52-96e3-4e3c564b846f\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.232083 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plvkw\" (UniqueName: \"kubernetes.io/projected/c6f850af-f692-4fa4-b289-1fd426f79090-kube-api-access-plvkw\") pod \"nmstate-webhook-6b89b748d8-55lsz\" (UID: \"c6f850af-f692-4fa4-b289-1fd426f79090\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.233011 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcwg4\" (UniqueName: \"kubernetes.io/projected/9173cda0-1bab-4e52-96e3-4e3c564b846f-kube-api-access-bcwg4\") pod \"nmstate-metrics-5dcf9c57c5-bkbgk\" (UID: \"9173cda0-1bab-4e52-96e3-4e3c564b846f\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.315737 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqrf\" (UniqueName: \"kubernetes.io/projected/dd1e7111-d57d-44c4-bcdb-7045dc626f01-kube-api-access-4wqrf\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.315830 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dd1e7111-d57d-44c4-bcdb-7045dc626f01-dbus-socket\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.316299 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dd1e7111-d57d-44c4-bcdb-7045dc626f01-dbus-socket\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.316445 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.316524 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dd1e7111-d57d-44c4-bcdb-7045dc626f01-ovs-socket\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.316576 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4w4h\" (UniqueName: \"kubernetes.io/projected/ea454b74-e77b-4f90-8311-563ab0e66191-kube-api-access-f4w4h\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.316603 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea454b74-e77b-4f90-8311-563ab0e66191-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.316644 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ea454b74-e77b-4f90-8311-563ab0e66191-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.316677 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dd1e7111-d57d-44c4-bcdb-7045dc626f01-nmstate-lock\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.316683 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dd1e7111-d57d-44c4-bcdb-7045dc626f01-ovs-socket\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.316741 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dd1e7111-d57d-44c4-bcdb-7045dc626f01-nmstate-lock\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.343042 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wqrf\" (UniqueName: \"kubernetes.io/projected/dd1e7111-d57d-44c4-bcdb-7045dc626f01-kube-api-access-4wqrf\") pod \"nmstate-handler-lpsrm\" (UID: \"dd1e7111-d57d-44c4-bcdb-7045dc626f01\") " pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.357838 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.383188 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dffbb58c8-qkllv"] Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.384240 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.402550 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dffbb58c8-qkllv"] Nov 22 09:26:01 crc kubenswrapper[4846]: W1122 09:26:01.412960 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd1e7111_d57d_44c4_bcdb_7045dc626f01.slice/crio-ad48e73bde4b282d92d088f0f8afc141ddb46d4b95be1d2e1069e5204ab8dad7 WatchSource:0}: Error finding container ad48e73bde4b282d92d088f0f8afc141ddb46d4b95be1d2e1069e5204ab8dad7: Status 404 returned error can't find the container with id ad48e73bde4b282d92d088f0f8afc141ddb46d4b95be1d2e1069e5204ab8dad7 Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.418076 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4w4h\" (UniqueName: \"kubernetes.io/projected/ea454b74-e77b-4f90-8311-563ab0e66191-kube-api-access-f4w4h\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.418121 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea454b74-e77b-4f90-8311-563ab0e66191-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.418164 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ea454b74-e77b-4f90-8311-563ab0e66191-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.419202 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ea454b74-e77b-4f90-8311-563ab0e66191-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: E1122 09:26:01.419708 4846 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Nov 22 09:26:01 crc kubenswrapper[4846]: E1122 09:26:01.419761 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea454b74-e77b-4f90-8311-563ab0e66191-plugin-serving-cert podName:ea454b74-e77b-4f90-8311-563ab0e66191 nodeName:}" failed. No retries permitted until 2025-11-22 09:26:01.919744314 +0000 UTC m=+736.855433963 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/ea454b74-e77b-4f90-8311-563ab0e66191-plugin-serving-cert") pod "nmstate-console-plugin-5874bd7bc5-zplwx" (UID: "ea454b74-e77b-4f90-8311-563ab0e66191") : secret "plugin-serving-cert" not found Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.453352 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4w4h\" (UniqueName: \"kubernetes.io/projected/ea454b74-e77b-4f90-8311-563ab0e66191-kube-api-access-f4w4h\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.523927 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-console-config\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.523994 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-service-ca\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.524079 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-console-oauth-config\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.524121 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-oauth-serving-cert\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.524185 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-trusted-ca-bundle\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.524210 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klmdg\" (UniqueName: \"kubernetes.io/projected/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-kube-api-access-klmdg\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.524259 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-console-serving-cert\") pod \"console-5dffbb58c8-qkllv\" (UID: 
\"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.573916 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk"] Nov 22 09:26:01 crc kubenswrapper[4846]: W1122 09:26:01.579627 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9173cda0_1bab_4e52_96e3_4e3c564b846f.slice/crio-65207480e13d4bfaf5cf95e4f1aa24aa8c54fb6c8fd31e522337f7511bc38b9d WatchSource:0}: Error finding container 65207480e13d4bfaf5cf95e4f1aa24aa8c54fb6c8fd31e522337f7511bc38b9d: Status 404 returned error can't find the container with id 65207480e13d4bfaf5cf95e4f1aa24aa8c54fb6c8fd31e522337f7511bc38b9d Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.625003 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-console-oauth-config\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.625093 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-oauth-serving-cert\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.625145 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-trusted-ca-bundle\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.625165 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klmdg\" (UniqueName: \"kubernetes.io/projected/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-kube-api-access-klmdg\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.625220 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-console-serving-cert\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.625251 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-console-config\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.625272 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-service-ca\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " 
pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.626315 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-service-ca\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.626573 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-oauth-serving-cert\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.626667 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-console-config\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.628358 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-trusted-ca-bundle\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.630587 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-console-serving-cert\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.632456 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-console-oauth-config\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.643234 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klmdg\" (UniqueName: \"kubernetes.io/projected/28ef9dea-7da4-4cf6-bf5f-5ced7380cf22-kube-api-access-klmdg\") pod \"console-5dffbb58c8-qkllv\" (UID: \"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22\") " pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.726815 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c6f850af-f692-4fa4-b289-1fd426f79090-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-55lsz\" (UID: \"c6f850af-f692-4fa4-b289-1fd426f79090\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.730687 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c6f850af-f692-4fa4-b289-1fd426f79090-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-55lsz\" (UID: \"c6f850af-f692-4fa4-b289-1fd426f79090\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 
09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.742543 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.928369 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lpsrm" event={"ID":"dd1e7111-d57d-44c4-bcdb-7045dc626f01","Type":"ContainerStarted","Data":"ad48e73bde4b282d92d088f0f8afc141ddb46d4b95be1d2e1069e5204ab8dad7"} Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.928901 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea454b74-e77b-4f90-8311-563ab0e66191-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.929140 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk" event={"ID":"9173cda0-1bab-4e52-96e3-4e3c564b846f","Type":"ContainerStarted","Data":"65207480e13d4bfaf5cf95e4f1aa24aa8c54fb6c8fd31e522337f7511bc38b9d"} Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.933349 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea454b74-e77b-4f90-8311-563ab0e66191-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-zplwx\" (UID: \"ea454b74-e77b-4f90-8311-563ab0e66191\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.940983 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:01 crc kubenswrapper[4846]: I1122 09:26:01.944579 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dffbb58c8-qkllv"] Nov 22 09:26:02 crc kubenswrapper[4846]: I1122 09:26:02.070906 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" Nov 22 09:26:02 crc kubenswrapper[4846]: I1122 09:26:02.162109 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz"] Nov 22 09:26:02 crc kubenswrapper[4846]: W1122 09:26:02.167590 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6f850af_f692_4fa4_b289_1fd426f79090.slice/crio-44d7ed9f31c53e8c19bc348353382c6de428569c9d7b25447cf08c00720e7ff9 WatchSource:0}: Error finding container 44d7ed9f31c53e8c19bc348353382c6de428569c9d7b25447cf08c00720e7ff9: Status 404 returned error can't find the container with id 44d7ed9f31c53e8c19bc348353382c6de428569c9d7b25447cf08c00720e7ff9 Nov 22 09:26:02 crc kubenswrapper[4846]: I1122 09:26:02.307364 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx"] Nov 22 09:26:02 crc kubenswrapper[4846]: W1122 09:26:02.314265 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea454b74_e77b_4f90_8311_563ab0e66191.slice/crio-b22f614c5345d6b0b4efe72196c5efdb26fd410e21082baf433ebcfd3dd35aa9 WatchSource:0}: Error finding container b22f614c5345d6b0b4efe72196c5efdb26fd410e21082baf433ebcfd3dd35aa9: Status 404 returned error can't find the container with id b22f614c5345d6b0b4efe72196c5efdb26fd410e21082baf433ebcfd3dd35aa9 Nov 22 09:26:02 crc kubenswrapper[4846]: I1122 09:26:02.946683 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dffbb58c8-qkllv" event={"ID":"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22","Type":"ContainerStarted","Data":"dda2d1dc9cd6d1deae2477e2566e59d39b02f7fd67928e3a64c1d73a21fd98b0"} Nov 22 09:26:02 crc kubenswrapper[4846]: I1122 09:26:02.948438 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dffbb58c8-qkllv" event={"ID":"28ef9dea-7da4-4cf6-bf5f-5ced7380cf22","Type":"ContainerStarted","Data":"651835925bc0ffdcd18892e7edf0a4b64ba218bb6aebbbf20a90554c8e9cfe85"} Nov 22 09:26:02 crc kubenswrapper[4846]: I1122 09:26:02.949537 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" event={"ID":"c6f850af-f692-4fa4-b289-1fd426f79090","Type":"ContainerStarted","Data":"44d7ed9f31c53e8c19bc348353382c6de428569c9d7b25447cf08c00720e7ff9"} Nov 22 09:26:02 crc kubenswrapper[4846]: I1122 09:26:02.952642 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" event={"ID":"ea454b74-e77b-4f90-8311-563ab0e66191","Type":"ContainerStarted","Data":"b22f614c5345d6b0b4efe72196c5efdb26fd410e21082baf433ebcfd3dd35aa9"} Nov 22 09:26:02 crc kubenswrapper[4846]: I1122 09:26:02.990106 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dffbb58c8-qkllv" podStartSLOduration=1.9900442630000001 podStartE2EDuration="1.990044263s" podCreationTimestamp="2025-11-22 09:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:26:02.982336968 +0000 UTC m=+737.918026667" watchObservedRunningTime="2025-11-22 09:26:02.990044263 +0000 UTC m=+737.925733942" Nov 22 09:26:05 crc kubenswrapper[4846]: I1122 09:26:05.974645 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" event={"ID":"c6f850af-f692-4fa4-b289-1fd426f79090","Type":"ContainerStarted","Data":"0822d7f3aa3ea320e563bf967b877b68535c5848c45c323791f2aaeec40cd946"} Nov 22 09:26:05 crc kubenswrapper[4846]: I1122 09:26:05.975351 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:05 crc kubenswrapper[4846]: I1122 09:26:05.976685 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" event={"ID":"ea454b74-e77b-4f90-8311-563ab0e66191","Type":"ContainerStarted","Data":"5245065cbeae9348fd7c8851cdcb515fbc2638f5e2757eaba80f0714c937f5fd"} Nov 22 09:26:05 crc kubenswrapper[4846]: I1122 09:26:05.978347 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk" event={"ID":"9173cda0-1bab-4e52-96e3-4e3c564b846f","Type":"ContainerStarted","Data":"3dcf05eb34b008fa8a3b85ee9397ee58f3a4d80040121c0b09ac2fd7e80efa3d"} Nov 22 09:26:05 crc kubenswrapper[4846]: I1122 09:26:05.980328 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lpsrm" event={"ID":"dd1e7111-d57d-44c4-bcdb-7045dc626f01","Type":"ContainerStarted","Data":"f0bb53abee900c4b788ce788c955dd2aaaf63d98d5c37cacac39a86b67cdfb10"} Nov 22 09:26:05 crc kubenswrapper[4846]: I1122 09:26:05.980516 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:06 crc kubenswrapper[4846]: I1122 09:26:05.998977 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" podStartSLOduration=3.312197063 podStartE2EDuration="5.998956039s" podCreationTimestamp="2025-11-22 09:26:00 +0000 UTC" firstStartedPulling="2025-11-22 09:26:02.169923583 +0000 UTC m=+737.105613232" lastFinishedPulling="2025-11-22 09:26:04.856682509 +0000 UTC m=+739.792372208" observedRunningTime="2025-11-22 09:26:05.997492096 +0000 UTC m=+740.933181755" watchObservedRunningTime="2025-11-22 09:26:05.998956039 +0000 UTC m=+740.934645698" Nov 22 09:26:06 crc kubenswrapper[4846]: I1122 09:26:06.022412 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-zplwx" podStartSLOduration=2.482356662 podStartE2EDuration="5.022383543s" podCreationTimestamp="2025-11-22 09:26:01 +0000 UTC" firstStartedPulling="2025-11-22 09:26:02.318653316 +0000 UTC m=+737.254342985" lastFinishedPulling="2025-11-22 09:26:04.858680217 +0000 UTC m=+739.794369866" observedRunningTime="2025-11-22 09:26:06.019349944 +0000 UTC m=+740.955039613" watchObservedRunningTime="2025-11-22 09:26:06.022383543 +0000 UTC m=+740.958073232" Nov 22 09:26:06 crc kubenswrapper[4846]: I1122 09:26:06.061525 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-lpsrm" podStartSLOduration=1.6382711300000001 podStartE2EDuration="5.061495045s" podCreationTimestamp="2025-11-22 09:26:01 +0000 UTC" firstStartedPulling="2025-11-22 09:26:01.43366802 +0000 UTC m=+736.369357669" lastFinishedPulling="2025-11-22 09:26:04.856891935 +0000 UTC m=+739.792581584" observedRunningTime="2025-11-22 09:26:06.046506147 +0000 UTC m=+740.982195816" watchObservedRunningTime="2025-11-22 09:26:06.061495045 +0000 UTC m=+740.997184694" Nov 22 09:26:08 crc kubenswrapper[4846]: I1122 09:26:08.002142 4846 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk" event={"ID":"9173cda0-1bab-4e52-96e3-4e3c564b846f","Type":"ContainerStarted","Data":"7acebd14d12e7d4d43236fc0b868ab0ac0959e31bc8c48596abbddbbc631bcb5"} Nov 22 09:26:08 crc kubenswrapper[4846]: I1122 09:26:08.021748 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-bkbgk" podStartSLOduration=2.285877969 podStartE2EDuration="8.021710252s" podCreationTimestamp="2025-11-22 09:26:00 +0000 UTC" firstStartedPulling="2025-11-22 09:26:01.582625231 +0000 UTC m=+736.518314880" lastFinishedPulling="2025-11-22 09:26:07.318457514 +0000 UTC m=+742.254147163" observedRunningTime="2025-11-22 09:26:08.017957293 +0000 UTC m=+742.953647012" watchObservedRunningTime="2025-11-22 09:26:08.021710252 +0000 UTC m=+742.957399941" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.120545 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xxpch"] Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.121278 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" podUID="fd5e681f-ca95-4ba0-935e-86f18702cf78" containerName="controller-manager" containerID="cri-o://06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e" gracePeriod=30 Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.215321 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx"] Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.215609 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" podUID="c9204723-54c5-457c-8bb8-58be85f199e2" containerName="route-controller-manager" containerID="cri-o://2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b" gracePeriod=30 Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.416107 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-lpsrm" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.583429 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.595742 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-proxy-ca-bundles\") pod \"fd5e681f-ca95-4ba0-935e-86f18702cf78\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.595789 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5e681f-ca95-4ba0-935e-86f18702cf78-serving-cert\") pod \"fd5e681f-ca95-4ba0-935e-86f18702cf78\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.595981 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sddbs\" (UniqueName: \"kubernetes.io/projected/fd5e681f-ca95-4ba0-935e-86f18702cf78-kube-api-access-sddbs\") pod \"fd5e681f-ca95-4ba0-935e-86f18702cf78\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.596018 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-config\") pod \"fd5e681f-ca95-4ba0-935e-86f18702cf78\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.596093 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-client-ca\") pod \"fd5e681f-ca95-4ba0-935e-86f18702cf78\" (UID: \"fd5e681f-ca95-4ba0-935e-86f18702cf78\") " Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.596868 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fd5e681f-ca95-4ba0-935e-86f18702cf78" (UID: "fd5e681f-ca95-4ba0-935e-86f18702cf78"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.597522 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-client-ca" (OuterVolumeSpecName: "client-ca") pod "fd5e681f-ca95-4ba0-935e-86f18702cf78" (UID: "fd5e681f-ca95-4ba0-935e-86f18702cf78"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.597580 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-config" (OuterVolumeSpecName: "config") pod "fd5e681f-ca95-4ba0-935e-86f18702cf78" (UID: "fd5e681f-ca95-4ba0-935e-86f18702cf78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.606971 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5e681f-ca95-4ba0-935e-86f18702cf78-kube-api-access-sddbs" (OuterVolumeSpecName: "kube-api-access-sddbs") pod "fd5e681f-ca95-4ba0-935e-86f18702cf78" (UID: "fd5e681f-ca95-4ba0-935e-86f18702cf78"). InnerVolumeSpecName "kube-api-access-sddbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.609570 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5e681f-ca95-4ba0-935e-86f18702cf78-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fd5e681f-ca95-4ba0-935e-86f18702cf78" (UID: "fd5e681f-ca95-4ba0-935e-86f18702cf78"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.691816 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.697497 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9204723-54c5-457c-8bb8-58be85f199e2-serving-cert\") pod \"c9204723-54c5-457c-8bb8-58be85f199e2\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.697559 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-config\") pod \"c9204723-54c5-457c-8bb8-58be85f199e2\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.697611 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-client-ca\") pod \"c9204723-54c5-457c-8bb8-58be85f199e2\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.697670 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8mlh\" (UniqueName: \"kubernetes.io/projected/c9204723-54c5-457c-8bb8-58be85f199e2-kube-api-access-f8mlh\") pod \"c9204723-54c5-457c-8bb8-58be85f199e2\" (UID: \"c9204723-54c5-457c-8bb8-58be85f199e2\") " Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.698010 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sddbs\" (UniqueName: \"kubernetes.io/projected/fd5e681f-ca95-4ba0-935e-86f18702cf78-kube-api-access-sddbs\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.698028 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.698039 4846 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.698067 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5e681f-ca95-4ba0-935e-86f18702cf78-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.698078 4846 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd5e681f-ca95-4ba0-935e-86f18702cf78-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.698929 4846 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-client-ca" (OuterVolumeSpecName: "client-ca") pod "c9204723-54c5-457c-8bb8-58be85f199e2" (UID: "c9204723-54c5-457c-8bb8-58be85f199e2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.698943 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-config" (OuterVolumeSpecName: "config") pod "c9204723-54c5-457c-8bb8-58be85f199e2" (UID: "c9204723-54c5-457c-8bb8-58be85f199e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.702279 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9204723-54c5-457c-8bb8-58be85f199e2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c9204723-54c5-457c-8bb8-58be85f199e2" (UID: "c9204723-54c5-457c-8bb8-58be85f199e2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.702423 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9204723-54c5-457c-8bb8-58be85f199e2-kube-api-access-f8mlh" (OuterVolumeSpecName: "kube-api-access-f8mlh") pod "c9204723-54c5-457c-8bb8-58be85f199e2" (UID: "c9204723-54c5-457c-8bb8-58be85f199e2"). InnerVolumeSpecName "kube-api-access-f8mlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.743632 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.743721 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.748868 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.798556 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8mlh\" (UniqueName: \"kubernetes.io/projected/c9204723-54c5-457c-8bb8-58be85f199e2-kube-api-access-f8mlh\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.798619 4846 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9204723-54c5-457c-8bb8-58be85f199e2-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.798634 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:11 crc kubenswrapper[4846]: I1122 09:26:11.798648 4846 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9204723-54c5-457c-8bb8-58be85f199e2-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.030961 4846 generic.go:334] "Generic (PLEG): container finished" podID="fd5e681f-ca95-4ba0-935e-86f18702cf78" containerID="06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e" exitCode=0 Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.031117 4846 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" event={"ID":"fd5e681f-ca95-4ba0-935e-86f18702cf78","Type":"ContainerDied","Data":"06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e"} Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.031161 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" event={"ID":"fd5e681f-ca95-4ba0-935e-86f18702cf78","Type":"ContainerDied","Data":"c9f1302d27bd3065b41bd8f65284614f65d6e7e2d758e286abd551453082cae2"} Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.031188 4846 scope.go:117] "RemoveContainer" containerID="06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.031373 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xxpch" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.036298 4846 generic.go:334] "Generic (PLEG): container finished" podID="c9204723-54c5-457c-8bb8-58be85f199e2" containerID="2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b" exitCode=0 Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.036992 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.042948 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dffbb58c8-qkllv" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.042986 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" event={"ID":"c9204723-54c5-457c-8bb8-58be85f199e2","Type":"ContainerDied","Data":"2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b"} Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.043008 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx" event={"ID":"c9204723-54c5-457c-8bb8-58be85f199e2","Type":"ContainerDied","Data":"59e455ede513eb86dd2a13dbee135765f4d25fa89de7bbd2221f7d8b27c0edd8"} Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.060299 4846 scope.go:117] "RemoveContainer" containerID="06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e" Nov 22 09:26:12 crc kubenswrapper[4846]: E1122 09:26:12.060881 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e\": container with ID starting with 06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e not found: ID does not exist" containerID="06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.060918 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e"} err="failed to get container status \"06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e\": rpc error: code = NotFound desc = could not find container \"06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e\": container with ID starting with 
06163bfafc2d22184867c18e0e62661ea8d935822a94bfd1656a89de41d61f5e not found: ID does not exist" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.060948 4846 scope.go:117] "RemoveContainer" containerID="2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.085660 4846 scope.go:117] "RemoveContainer" containerID="2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b" Nov 22 09:26:12 crc kubenswrapper[4846]: E1122 09:26:12.095369 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b\": container with ID starting with 2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b not found: ID does not exist" containerID="2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.095424 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b"} err="failed to get container status \"2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b\": rpc error: code = NotFound desc = could not find container \"2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b\": container with ID starting with 2d6dc4d48756daaf8a506ea015f03716471ca6dfbfbe2124792fc846c1330f6b not found: ID does not exist" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.098910 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx"] Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.112693 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vmbgx"] Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.119276 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-k86mj"] Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.125109 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xxpch"] Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.128270 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xxpch"] Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.362203 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt"] Nov 22 09:26:12 crc kubenswrapper[4846]: E1122 09:26:12.362875 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5e681f-ca95-4ba0-935e-86f18702cf78" containerName="controller-manager" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.362971 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5e681f-ca95-4ba0-935e-86f18702cf78" containerName="controller-manager" Nov 22 09:26:12 crc kubenswrapper[4846]: E1122 09:26:12.363083 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9204723-54c5-457c-8bb8-58be85f199e2" containerName="route-controller-manager" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.363201 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9204723-54c5-457c-8bb8-58be85f199e2" containerName="route-controller-manager" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.363414 4846 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5e681f-ca95-4ba0-935e-86f18702cf78" containerName="controller-manager" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.363498 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9204723-54c5-457c-8bb8-58be85f199e2" containerName="route-controller-manager" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.364173 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.368777 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.368832 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.369235 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.369240 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.369294 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.369853 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.378122 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.386756 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt"] Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.510392 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-proxy-ca-bundles\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.510454 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjrls\" (UniqueName: \"kubernetes.io/projected/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-kube-api-access-sjrls\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.511119 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-config\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.511229 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-client-ca\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.511397 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-serving-cert\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.612507 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-proxy-ca-bundles\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.612572 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjrls\" (UniqueName: \"kubernetes.io/projected/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-kube-api-access-sjrls\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.612636 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-config\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.612662 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-client-ca\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.612700 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-serving-cert\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.613794 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-client-ca\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.614220 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-config\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " 
pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.614258 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-proxy-ca-bundles\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.617746 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-serving-cert\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.635258 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjrls\" (UniqueName: \"kubernetes.io/projected/595aa97d-6c93-4d0a-8cd9-1ec6466ec418-kube-api-access-sjrls\") pod \"controller-manager-7499fbdbd5-mjxkt\" (UID: \"595aa97d-6c93-4d0a-8cd9-1ec6466ec418\") " pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:12 crc kubenswrapper[4846]: I1122 09:26:12.682363 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.080325 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt"] Nov 22 09:26:13 crc kubenswrapper[4846]: W1122 09:26:13.085088 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod595aa97d_6c93_4d0a_8cd9_1ec6466ec418.slice/crio-a85bf0dfafba0548d4dad00c868c176cab4849dcc64ff4348b9723c55d95f478 WatchSource:0}: Error finding container a85bf0dfafba0548d4dad00c868c176cab4849dcc64ff4348b9723c55d95f478: Status 404 returned error can't find the container with id a85bf0dfafba0548d4dad00c868c176cab4849dcc64ff4348b9723c55d95f478 Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.358852 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz"] Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.360292 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.362728 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.363249 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.363943 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.364022 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.366083 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.368145 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.370331 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz"] Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.527187 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec12556a-b6f5-4fbe-b27d-e6253cd27520-client-ca\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.527271 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec12556a-b6f5-4fbe-b27d-e6253cd27520-config\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.527296 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gslff\" (UniqueName: \"kubernetes.io/projected/ec12556a-b6f5-4fbe-b27d-e6253cd27520-kube-api-access-gslff\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.527329 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec12556a-b6f5-4fbe-b27d-e6253cd27520-serving-cert\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.629504 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec12556a-b6f5-4fbe-b27d-e6253cd27520-config\") pod 
\"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.629580 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gslff\" (UniqueName: \"kubernetes.io/projected/ec12556a-b6f5-4fbe-b27d-e6253cd27520-kube-api-access-gslff\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.629666 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec12556a-b6f5-4fbe-b27d-e6253cd27520-serving-cert\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.629736 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec12556a-b6f5-4fbe-b27d-e6253cd27520-client-ca\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.630897 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec12556a-b6f5-4fbe-b27d-e6253cd27520-client-ca\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.632934 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec12556a-b6f5-4fbe-b27d-e6253cd27520-config\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.641753 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec12556a-b6f5-4fbe-b27d-e6253cd27520-serving-cert\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.659976 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gslff\" (UniqueName: \"kubernetes.io/projected/ec12556a-b6f5-4fbe-b27d-e6253cd27520-kube-api-access-gslff\") pod \"route-controller-manager-55489b945b-clmvz\" (UID: \"ec12556a-b6f5-4fbe-b27d-e6253cd27520\") " pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.676515 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:13 crc kubenswrapper[4846]: I1122 09:26:13.983936 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz"] Nov 22 09:26:14 crc kubenswrapper[4846]: I1122 09:26:14.055908 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9204723-54c5-457c-8bb8-58be85f199e2" path="/var/lib/kubelet/pods/c9204723-54c5-457c-8bb8-58be85f199e2/volumes" Nov 22 09:26:14 crc kubenswrapper[4846]: I1122 09:26:14.056714 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5e681f-ca95-4ba0-935e-86f18702cf78" path="/var/lib/kubelet/pods/fd5e681f-ca95-4ba0-935e-86f18702cf78/volumes" Nov 22 09:26:14 crc kubenswrapper[4846]: I1122 09:26:14.062896 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" event={"ID":"ec12556a-b6f5-4fbe-b27d-e6253cd27520","Type":"ContainerStarted","Data":"88ee5d2a306386d1e3d5fd7bd57dce7f18e6256e59e90acc620c43e75359f1ac"} Nov 22 09:26:14 crc kubenswrapper[4846]: I1122 09:26:14.069664 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" event={"ID":"595aa97d-6c93-4d0a-8cd9-1ec6466ec418","Type":"ContainerStarted","Data":"b5c83db8bfc03b490113ff6a0f1a62496495a723368d3d34186adf6256f27a4a"} Nov 22 09:26:14 crc kubenswrapper[4846]: I1122 09:26:14.070332 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" event={"ID":"595aa97d-6c93-4d0a-8cd9-1ec6466ec418","Type":"ContainerStarted","Data":"a85bf0dfafba0548d4dad00c868c176cab4849dcc64ff4348b9723c55d95f478"} Nov 22 09:26:14 crc kubenswrapper[4846]: I1122 09:26:14.070364 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:14 crc kubenswrapper[4846]: I1122 09:26:14.075076 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" Nov 22 09:26:14 crc kubenswrapper[4846]: I1122 09:26:14.091814 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7499fbdbd5-mjxkt" podStartSLOduration=3.091787127 podStartE2EDuration="3.091787127s" podCreationTimestamp="2025-11-22 09:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:26:14.088415628 +0000 UTC m=+749.024105277" watchObservedRunningTime="2025-11-22 09:26:14.091787127 +0000 UTC m=+749.027476776" Nov 22 09:26:15 crc kubenswrapper[4846]: I1122 09:26:15.081620 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" event={"ID":"ec12556a-b6f5-4fbe-b27d-e6253cd27520","Type":"ContainerStarted","Data":"c2218d3dd8406ef790ddcbfed42212b25dad547dce957ef97382bc1916ed8660"} Nov 22 09:26:15 crc kubenswrapper[4846]: I1122 09:26:15.082564 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:15 crc kubenswrapper[4846]: I1122 09:26:15.091887 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" Nov 22 09:26:15 crc kubenswrapper[4846]: I1122 09:26:15.109272 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55489b945b-clmvz" podStartSLOduration=4.109244261 podStartE2EDuration="4.109244261s" podCreationTimestamp="2025-11-22 09:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:26:15.104916055 +0000 UTC m=+750.040605744" watchObservedRunningTime="2025-11-22 09:26:15.109244261 +0000 UTC m=+750.044933910" Nov 22 09:26:20 crc kubenswrapper[4846]: I1122 09:26:20.401869 4846 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 22 09:26:21 crc kubenswrapper[4846]: I1122 09:26:21.948726 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-55lsz" Nov 22 09:26:28 crc kubenswrapper[4846]: I1122 09:26:28.626096 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:26:28 crc kubenswrapper[4846]: I1122 09:26:28.627080 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:26:35 crc kubenswrapper[4846]: I1122 09:26:35.937529 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w"] Nov 22 09:26:35 crc kubenswrapper[4846]: I1122 09:26:35.941197 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:35 crc kubenswrapper[4846]: I1122 09:26:35.943242 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w"] Nov 22 09:26:35 crc kubenswrapper[4846]: I1122 09:26:35.943956 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.068091 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.068627 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqw2\" (UniqueName: \"kubernetes.io/projected/3743cbee-9a49-40c8-bdae-7913ec94b4d1-kube-api-access-5wqw2\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.068729 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.170183 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.170250 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.170320 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqw2\" (UniqueName: \"kubernetes.io/projected/3743cbee-9a49-40c8-bdae-7913ec94b4d1-kube-api-access-5wqw2\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.170930 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.171142 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.191079 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqw2\" (UniqueName: \"kubernetes.io/projected/3743cbee-9a49-40c8-bdae-7913ec94b4d1-kube-api-access-5wqw2\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.262606 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:36 crc kubenswrapper[4846]: I1122 09:26:36.697305 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w"] Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.169146 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-k86mj" podUID="f23592b0-b045-4aa5-a22f-c15133890ed4" containerName="console" containerID="cri-o://e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637" gracePeriod=15 Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.265875 4846 generic.go:334] "Generic (PLEG): container finished" podID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerID="1aca7641f04ca0139bacde7d9aaa9dc02a1e6bede50dedfb801157b32d8f6c37" exitCode=0 Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.265962 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" event={"ID":"3743cbee-9a49-40c8-bdae-7913ec94b4d1","Type":"ContainerDied","Data":"1aca7641f04ca0139bacde7d9aaa9dc02a1e6bede50dedfb801157b32d8f6c37"} Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.266032 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" event={"ID":"3743cbee-9a49-40c8-bdae-7913ec94b4d1","Type":"ContainerStarted","Data":"749803cb3ce54907ef401bf569ba5de451bcba6e7072046aed82f7cf2dea4eaa"} Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.654918 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-k86mj_f23592b0-b045-4aa5-a22f-c15133890ed4/console/0.log" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.655302 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.794504 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-oauth-config\") pod \"f23592b0-b045-4aa5-a22f-c15133890ed4\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.794612 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrmh9\" (UniqueName: \"kubernetes.io/projected/f23592b0-b045-4aa5-a22f-c15133890ed4-kube-api-access-qrmh9\") pod \"f23592b0-b045-4aa5-a22f-c15133890ed4\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.794705 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-trusted-ca-bundle\") pod \"f23592b0-b045-4aa5-a22f-c15133890ed4\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.794788 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-console-config\") pod \"f23592b0-b045-4aa5-a22f-c15133890ed4\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.794865 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-serving-cert\") pod \"f23592b0-b045-4aa5-a22f-c15133890ed4\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.794933 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-service-ca\") pod \"f23592b0-b045-4aa5-a22f-c15133890ed4\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.795029 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-oauth-serving-cert\") pod \"f23592b0-b045-4aa5-a22f-c15133890ed4\" (UID: \"f23592b0-b045-4aa5-a22f-c15133890ed4\") " Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.796713 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-service-ca" (OuterVolumeSpecName: "service-ca") pod "f23592b0-b045-4aa5-a22f-c15133890ed4" (UID: "f23592b0-b045-4aa5-a22f-c15133890ed4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.796773 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-console-config" (OuterVolumeSpecName: "console-config") pod "f23592b0-b045-4aa5-a22f-c15133890ed4" (UID: "f23592b0-b045-4aa5-a22f-c15133890ed4"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.796735 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f23592b0-b045-4aa5-a22f-c15133890ed4" (UID: "f23592b0-b045-4aa5-a22f-c15133890ed4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.796868 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f23592b0-b045-4aa5-a22f-c15133890ed4" (UID: "f23592b0-b045-4aa5-a22f-c15133890ed4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.803550 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f23592b0-b045-4aa5-a22f-c15133890ed4" (UID: "f23592b0-b045-4aa5-a22f-c15133890ed4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.804488 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f23592b0-b045-4aa5-a22f-c15133890ed4" (UID: "f23592b0-b045-4aa5-a22f-c15133890ed4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.806590 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23592b0-b045-4aa5-a22f-c15133890ed4-kube-api-access-qrmh9" (OuterVolumeSpecName: "kube-api-access-qrmh9") pod "f23592b0-b045-4aa5-a22f-c15133890ed4" (UID: "f23592b0-b045-4aa5-a22f-c15133890ed4"). InnerVolumeSpecName "kube-api-access-qrmh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.896839 4846 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.896916 4846 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.896939 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrmh9\" (UniqueName: \"kubernetes.io/projected/f23592b0-b045-4aa5-a22f-c15133890ed4-kube-api-access-qrmh9\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.896961 4846 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.896980 4846 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-console-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.896999 4846 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f23592b0-b045-4aa5-a22f-c15133890ed4-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:37 crc kubenswrapper[4846]: I1122 09:26:37.897016 4846 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f23592b0-b045-4aa5-a22f-c15133890ed4-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.274257 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-99vqt"] Nov 22 09:26:38 crc kubenswrapper[4846]: E1122 09:26:38.274557 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23592b0-b045-4aa5-a22f-c15133890ed4" containerName="console" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.274571 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23592b0-b045-4aa5-a22f-c15133890ed4" containerName="console" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.274696 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23592b0-b045-4aa5-a22f-c15133890ed4" containerName="console" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.276557 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.282282 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-k86mj_f23592b0-b045-4aa5-a22f-c15133890ed4/console/0.log" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.282874 4846 generic.go:334] "Generic (PLEG): container finished" podID="f23592b0-b045-4aa5-a22f-c15133890ed4" containerID="e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637" exitCode=2 Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.282932 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k86mj" event={"ID":"f23592b0-b045-4aa5-a22f-c15133890ed4","Type":"ContainerDied","Data":"e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637"} Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.282981 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k86mj" event={"ID":"f23592b0-b045-4aa5-a22f-c15133890ed4","Type":"ContainerDied","Data":"7372051fc3a553594ec77d5e8fbd2c9f895cc2418a8f4503d46c06d912f9ce60"} Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.283004 4846 scope.go:117] "RemoveContainer" containerID="e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.284514 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-k86mj" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.291434 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99vqt"] Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.306175 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-catalog-content\") pod \"redhat-operators-99vqt\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.306215 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-utilities\") pod \"redhat-operators-99vqt\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.306272 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lls\" (UniqueName: \"kubernetes.io/projected/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-kube-api-access-m2lls\") pod \"redhat-operators-99vqt\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.314341 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-k86mj"] Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.329917 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-k86mj"] Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.408064 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-catalog-content\") pod 
\"redhat-operators-99vqt\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.408124 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-utilities\") pod \"redhat-operators-99vqt\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.408190 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lls\" (UniqueName: \"kubernetes.io/projected/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-kube-api-access-m2lls\") pod \"redhat-operators-99vqt\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.408870 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-catalog-content\") pod \"redhat-operators-99vqt\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.409327 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-utilities\") pod \"redhat-operators-99vqt\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.410209 4846 scope.go:117] "RemoveContainer" containerID="e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637" Nov 22 09:26:38 crc kubenswrapper[4846]: E1122 09:26:38.410764 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637\": container with ID starting with e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637 not found: ID does not exist" containerID="e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.410815 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637"} err="failed to get container status \"e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637\": rpc error: code = NotFound desc = could not find container \"e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637\": container with ID starting with e2662b222970ef54240db475f2aa3a4329034fdefe04006cd338d131b5efd637 not found: ID does not exist" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.429462 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lls\" (UniqueName: \"kubernetes.io/projected/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-kube-api-access-m2lls\") pod \"redhat-operators-99vqt\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:38 crc kubenswrapper[4846]: I1122 09:26:38.598764 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:39 crc kubenswrapper[4846]: I1122 09:26:39.037481 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99vqt"] Nov 22 09:26:39 crc kubenswrapper[4846]: I1122 09:26:39.294267 4846 generic.go:334] "Generic (PLEG): container finished" podID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerID="332bd11529d439dc2379b5d220bde54a57428a9c6f16f7ddf9e002f0e0beac87" exitCode=0 Nov 22 09:26:39 crc kubenswrapper[4846]: I1122 09:26:39.294386 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" event={"ID":"3743cbee-9a49-40c8-bdae-7913ec94b4d1","Type":"ContainerDied","Data":"332bd11529d439dc2379b5d220bde54a57428a9c6f16f7ddf9e002f0e0beac87"} Nov 22 09:26:39 crc kubenswrapper[4846]: I1122 09:26:39.296540 4846 generic.go:334] "Generic (PLEG): container finished" podID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerID="e72b9dd985087c95448080b5d6c149c46ad6f0d2cfe49d92125df692e33df836" exitCode=0 Nov 22 09:26:39 crc kubenswrapper[4846]: I1122 09:26:39.296584 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99vqt" event={"ID":"dcb7aa59-21a8-4483-b4fc-56c3d7883d77","Type":"ContainerDied","Data":"e72b9dd985087c95448080b5d6c149c46ad6f0d2cfe49d92125df692e33df836"} Nov 22 09:26:39 crc kubenswrapper[4846]: I1122 09:26:39.296629 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99vqt" event={"ID":"dcb7aa59-21a8-4483-b4fc-56c3d7883d77","Type":"ContainerStarted","Data":"9843c90a034a66a7c4e059a8e87e51885ec8f13e7e595a4cba223b4422b6d348"} Nov 22 09:26:40 crc kubenswrapper[4846]: I1122 09:26:40.044464 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f23592b0-b045-4aa5-a22f-c15133890ed4" path="/var/lib/kubelet/pods/f23592b0-b045-4aa5-a22f-c15133890ed4/volumes" Nov 22 09:26:40 crc kubenswrapper[4846]: I1122 09:26:40.307368 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99vqt" event={"ID":"dcb7aa59-21a8-4483-b4fc-56c3d7883d77","Type":"ContainerStarted","Data":"149e21900294b3e3b026d44c7abe53723a541f3362b1612331e73a182737b796"} Nov 22 09:26:40 crc kubenswrapper[4846]: I1122 09:26:40.311988 4846 generic.go:334] "Generic (PLEG): container finished" podID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerID="05c9173543ed156b3eeddaa533149dc586fee5244b6242f895df76abb55a7de1" exitCode=0 Nov 22 09:26:40 crc kubenswrapper[4846]: I1122 09:26:40.312076 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" event={"ID":"3743cbee-9a49-40c8-bdae-7913ec94b4d1","Type":"ContainerDied","Data":"05c9173543ed156b3eeddaa533149dc586fee5244b6242f895df76abb55a7de1"} Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.321117 4846 generic.go:334] "Generic (PLEG): container finished" podID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerID="149e21900294b3e3b026d44c7abe53723a541f3362b1612331e73a182737b796" exitCode=0 Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.321190 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99vqt" event={"ID":"dcb7aa59-21a8-4483-b4fc-56c3d7883d77","Type":"ContainerDied","Data":"149e21900294b3e3b026d44c7abe53723a541f3362b1612331e73a182737b796"} Nov 22 09:26:41 crc 
kubenswrapper[4846]: I1122 09:26:41.715185 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.862440 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wqw2\" (UniqueName: \"kubernetes.io/projected/3743cbee-9a49-40c8-bdae-7913ec94b4d1-kube-api-access-5wqw2\") pod \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.862540 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-bundle\") pod \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.862700 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-util\") pod \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\" (UID: \"3743cbee-9a49-40c8-bdae-7913ec94b4d1\") " Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.863708 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-bundle" (OuterVolumeSpecName: "bundle") pod "3743cbee-9a49-40c8-bdae-7913ec94b4d1" (UID: "3743cbee-9a49-40c8-bdae-7913ec94b4d1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.869886 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3743cbee-9a49-40c8-bdae-7913ec94b4d1-kube-api-access-5wqw2" (OuterVolumeSpecName: "kube-api-access-5wqw2") pod "3743cbee-9a49-40c8-bdae-7913ec94b4d1" (UID: "3743cbee-9a49-40c8-bdae-7913ec94b4d1"). InnerVolumeSpecName "kube-api-access-5wqw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.877755 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-util" (OuterVolumeSpecName: "util") pod "3743cbee-9a49-40c8-bdae-7913ec94b4d1" (UID: "3743cbee-9a49-40c8-bdae-7913ec94b4d1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.964947 4846 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.964996 4846 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3743cbee-9a49-40c8-bdae-7913ec94b4d1-util\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:41 crc kubenswrapper[4846]: I1122 09:26:41.965011 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wqw2\" (UniqueName: \"kubernetes.io/projected/3743cbee-9a49-40c8-bdae-7913ec94b4d1-kube-api-access-5wqw2\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:42 crc kubenswrapper[4846]: I1122 09:26:42.331109 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99vqt" event={"ID":"dcb7aa59-21a8-4483-b4fc-56c3d7883d77","Type":"ContainerStarted","Data":"544d578048e4f1eb9148ccb4174b0e98f61dcd8d4564d0e95b6f12613d4ac887"} Nov 22 09:26:42 crc kubenswrapper[4846]: I1122 09:26:42.334555 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" event={"ID":"3743cbee-9a49-40c8-bdae-7913ec94b4d1","Type":"ContainerDied","Data":"749803cb3ce54907ef401bf569ba5de451bcba6e7072046aed82f7cf2dea4eaa"} Nov 22 09:26:42 crc kubenswrapper[4846]: I1122 09:26:42.334607 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="749803cb3ce54907ef401bf569ba5de451bcba6e7072046aed82f7cf2dea4eaa" Nov 22 09:26:42 crc kubenswrapper[4846]: I1122 09:26:42.334682 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w" Nov 22 09:26:42 crc kubenswrapper[4846]: I1122 09:26:42.360601 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-99vqt" podStartSLOduration=1.861877663 podStartE2EDuration="4.360577928s" podCreationTimestamp="2025-11-22 09:26:38 +0000 UTC" firstStartedPulling="2025-11-22 09:26:39.297958385 +0000 UTC m=+774.233648034" lastFinishedPulling="2025-11-22 09:26:41.79665865 +0000 UTC m=+776.732348299" observedRunningTime="2025-11-22 09:26:42.355738627 +0000 UTC m=+777.291428286" watchObservedRunningTime="2025-11-22 09:26:42.360577928 +0000 UTC m=+777.296267577" Nov 22 09:26:48 crc kubenswrapper[4846]: I1122 09:26:48.599948 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:48 crc kubenswrapper[4846]: I1122 09:26:48.600830 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:49 crc kubenswrapper[4846]: I1122 09:26:49.641707 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-99vqt" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerName="registry-server" probeResult="failure" output=< Nov 22 09:26:49 crc kubenswrapper[4846]: timeout: failed to connect service ":50051" within 1s Nov 22 09:26:49 crc kubenswrapper[4846]: > Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.266191 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq"] Nov 22 09:26:51 crc kubenswrapper[4846]: E1122 09:26:51.267013 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerName="util" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.267037 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerName="util" Nov 22 09:26:51 crc kubenswrapper[4846]: E1122 09:26:51.267079 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerName="pull" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.267089 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerName="pull" Nov 22 09:26:51 crc kubenswrapper[4846]: E1122 09:26:51.267098 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerName="extract" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.267107 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerName="extract" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.267250 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="3743cbee-9a49-40c8-bdae-7913ec94b4d1" containerName="extract" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.267878 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.275163 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.275163 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.275665 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.275762 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.276753 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fnp27" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.342881 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq"] Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.422963 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9a28c92-48ff-4026-819b-70068881c12b-apiservice-cert\") pod \"metallb-operator-controller-manager-6b9465489d-lwlfq\" (UID: \"a9a28c92-48ff-4026-819b-70068881c12b\") " pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.423462 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9a28c92-48ff-4026-819b-70068881c12b-webhook-cert\") pod \"metallb-operator-controller-manager-6b9465489d-lwlfq\" (UID: \"a9a28c92-48ff-4026-819b-70068881c12b\") " pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.423559 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk9mb\" (UniqueName: \"kubernetes.io/projected/a9a28c92-48ff-4026-819b-70068881c12b-kube-api-access-rk9mb\") pod \"metallb-operator-controller-manager-6b9465489d-lwlfq\" (UID: \"a9a28c92-48ff-4026-819b-70068881c12b\") " pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.528302 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9a28c92-48ff-4026-819b-70068881c12b-apiservice-cert\") pod \"metallb-operator-controller-manager-6b9465489d-lwlfq\" (UID: \"a9a28c92-48ff-4026-819b-70068881c12b\") " pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.528402 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9a28c92-48ff-4026-819b-70068881c12b-webhook-cert\") pod \"metallb-operator-controller-manager-6b9465489d-lwlfq\" (UID: \"a9a28c92-48ff-4026-819b-70068881c12b\") " pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.528426 4846 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk9mb\" (UniqueName: \"kubernetes.io/projected/a9a28c92-48ff-4026-819b-70068881c12b-kube-api-access-rk9mb\") pod \"metallb-operator-controller-manager-6b9465489d-lwlfq\" (UID: \"a9a28c92-48ff-4026-819b-70068881c12b\") " pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.531138 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485"] Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.532085 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.536751 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9w7c8" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.536787 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.536831 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.537994 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9a28c92-48ff-4026-819b-70068881c12b-webhook-cert\") pod \"metallb-operator-controller-manager-6b9465489d-lwlfq\" (UID: \"a9a28c92-48ff-4026-819b-70068881c12b\") " pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.539002 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9a28c92-48ff-4026-819b-70068881c12b-apiservice-cert\") pod \"metallb-operator-controller-manager-6b9465489d-lwlfq\" (UID: \"a9a28c92-48ff-4026-819b-70068881c12b\") " pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.558897 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk9mb\" (UniqueName: \"kubernetes.io/projected/a9a28c92-48ff-4026-819b-70068881c12b-kube-api-access-rk9mb\") pod \"metallb-operator-controller-manager-6b9465489d-lwlfq\" (UID: \"a9a28c92-48ff-4026-819b-70068881c12b\") " pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.585146 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.603647 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485"] Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.630018 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c8911b3-3f77-4666-822f-e40c1100c67f-webhook-cert\") pod \"metallb-operator-webhook-server-57ff77b6c8-sd485\" (UID: \"2c8911b3-3f77-4666-822f-e40c1100c67f\") " pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.630096 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c8911b3-3f77-4666-822f-e40c1100c67f-apiservice-cert\") pod \"metallb-operator-webhook-server-57ff77b6c8-sd485\" (UID: \"2c8911b3-3f77-4666-822f-e40c1100c67f\") " pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.630144 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kb6\" (UniqueName: \"kubernetes.io/projected/2c8911b3-3f77-4666-822f-e40c1100c67f-kube-api-access-w9kb6\") pod \"metallb-operator-webhook-server-57ff77b6c8-sd485\" (UID: \"2c8911b3-3f77-4666-822f-e40c1100c67f\") " pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.731486 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c8911b3-3f77-4666-822f-e40c1100c67f-apiservice-cert\") pod \"metallb-operator-webhook-server-57ff77b6c8-sd485\" (UID: \"2c8911b3-3f77-4666-822f-e40c1100c67f\") " pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.731910 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kb6\" (UniqueName: \"kubernetes.io/projected/2c8911b3-3f77-4666-822f-e40c1100c67f-kube-api-access-w9kb6\") pod \"metallb-operator-webhook-server-57ff77b6c8-sd485\" (UID: \"2c8911b3-3f77-4666-822f-e40c1100c67f\") " pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.731966 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c8911b3-3f77-4666-822f-e40c1100c67f-webhook-cert\") pod \"metallb-operator-webhook-server-57ff77b6c8-sd485\" (UID: \"2c8911b3-3f77-4666-822f-e40c1100c67f\") " pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.746518 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c8911b3-3f77-4666-822f-e40c1100c67f-apiservice-cert\") pod \"metallb-operator-webhook-server-57ff77b6c8-sd485\" (UID: \"2c8911b3-3f77-4666-822f-e40c1100c67f\") " pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.747664 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/2c8911b3-3f77-4666-822f-e40c1100c67f-webhook-cert\") pod \"metallb-operator-webhook-server-57ff77b6c8-sd485\" (UID: \"2c8911b3-3f77-4666-822f-e40c1100c67f\") " pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.753969 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kb6\" (UniqueName: \"kubernetes.io/projected/2c8911b3-3f77-4666-822f-e40c1100c67f-kube-api-access-w9kb6\") pod \"metallb-operator-webhook-server-57ff77b6c8-sd485\" (UID: \"2c8911b3-3f77-4666-822f-e40c1100c67f\") " pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:51 crc kubenswrapper[4846]: I1122 09:26:51.904707 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:52 crc kubenswrapper[4846]: I1122 09:26:52.111760 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq"] Nov 22 09:26:52 crc kubenswrapper[4846]: W1122 09:26:52.121828 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9a28c92_48ff_4026_819b_70068881c12b.slice/crio-7777de3c59b59239e5eae7134887a039a2118a6db737f875324a3e1fe42ccab6 WatchSource:0}: Error finding container 7777de3c59b59239e5eae7134887a039a2118a6db737f875324a3e1fe42ccab6: Status 404 returned error can't find the container with id 7777de3c59b59239e5eae7134887a039a2118a6db737f875324a3e1fe42ccab6 Nov 22 09:26:52 crc kubenswrapper[4846]: I1122 09:26:52.411097 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" event={"ID":"a9a28c92-48ff-4026-819b-70068881c12b","Type":"ContainerStarted","Data":"7777de3c59b59239e5eae7134887a039a2118a6db737f875324a3e1fe42ccab6"} Nov 22 09:26:52 crc kubenswrapper[4846]: I1122 09:26:52.440706 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485"] Nov 22 09:26:52 crc kubenswrapper[4846]: W1122 09:26:52.444274 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c8911b3_3f77_4666_822f_e40c1100c67f.slice/crio-e233669594731f0eb5a66503bd0fc822693a36d8ef5ccee4cd651079a8a8da43 WatchSource:0}: Error finding container e233669594731f0eb5a66503bd0fc822693a36d8ef5ccee4cd651079a8a8da43: Status 404 returned error can't find the container with id e233669594731f0eb5a66503bd0fc822693a36d8ef5ccee4cd651079a8a8da43 Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.419202 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" event={"ID":"2c8911b3-3f77-4666-822f-e40c1100c67f","Type":"ContainerStarted","Data":"e233669594731f0eb5a66503bd0fc822693a36d8ef5ccee4cd651079a8a8da43"} Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.469451 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gxw69"] Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.470989 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.488862 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gxw69"] Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.567612 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ztt\" (UniqueName: \"kubernetes.io/projected/2157e9ef-592a-40b0-912c-733863d9d8df-kube-api-access-s4ztt\") pod \"community-operators-gxw69\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.567699 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-utilities\") pod \"community-operators-gxw69\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.567755 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-catalog-content\") pod \"community-operators-gxw69\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.669816 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ztt\" (UniqueName: \"kubernetes.io/projected/2157e9ef-592a-40b0-912c-733863d9d8df-kube-api-access-s4ztt\") pod \"community-operators-gxw69\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.669881 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-utilities\") pod \"community-operators-gxw69\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.669919 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-catalog-content\") pod \"community-operators-gxw69\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.670668 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-catalog-content\") pod \"community-operators-gxw69\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.670779 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-utilities\") pod \"community-operators-gxw69\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.695762 4846 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s4ztt\" (UniqueName: \"kubernetes.io/projected/2157e9ef-592a-40b0-912c-733863d9d8df-kube-api-access-s4ztt\") pod \"community-operators-gxw69\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:53 crc kubenswrapper[4846]: I1122 09:26:53.797001 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:26:54 crc kubenswrapper[4846]: I1122 09:26:54.340081 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gxw69"] Nov 22 09:26:54 crc kubenswrapper[4846]: W1122 09:26:54.366843 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2157e9ef_592a_40b0_912c_733863d9d8df.slice/crio-acb7c6f271485f3bc7d5882bbf58c095d893bfc2efefef50055f21c97d5c663d WatchSource:0}: Error finding container acb7c6f271485f3bc7d5882bbf58c095d893bfc2efefef50055f21c97d5c663d: Status 404 returned error can't find the container with id acb7c6f271485f3bc7d5882bbf58c095d893bfc2efefef50055f21c97d5c663d Nov 22 09:26:54 crc kubenswrapper[4846]: I1122 09:26:54.429976 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxw69" event={"ID":"2157e9ef-592a-40b0-912c-733863d9d8df","Type":"ContainerStarted","Data":"acb7c6f271485f3bc7d5882bbf58c095d893bfc2efefef50055f21c97d5c663d"} Nov 22 09:26:55 crc kubenswrapper[4846]: I1122 09:26:55.438967 4846 generic.go:334] "Generic (PLEG): container finished" podID="2157e9ef-592a-40b0-912c-733863d9d8df" containerID="00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30" exitCode=0 Nov 22 09:26:55 crc kubenswrapper[4846]: I1122 09:26:55.439143 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxw69" event={"ID":"2157e9ef-592a-40b0-912c-733863d9d8df","Type":"ContainerDied","Data":"00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30"} Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.464623 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" event={"ID":"a9a28c92-48ff-4026-819b-70068881c12b","Type":"ContainerStarted","Data":"4168ae3842230f438e5f8f41400d5bfb5874683de2af1e9b92bad6741ba0c2a0"} Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.465316 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.468291 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" event={"ID":"2c8911b3-3f77-4666-822f-e40c1100c67f","Type":"ContainerStarted","Data":"99f4d5d29f853b2102f9456d72b6f3319f01e5cc3294d0d46decddf8e5edb49d"} Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.468363 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.471342 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxw69" event={"ID":"2157e9ef-592a-40b0-912c-733863d9d8df","Type":"ContainerStarted","Data":"6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf"} Nov 22 09:26:58 crc 
kubenswrapper[4846]: I1122 09:26:58.517559 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" podStartSLOduration=1.616745538 podStartE2EDuration="7.517529258s" podCreationTimestamp="2025-11-22 09:26:51 +0000 UTC" firstStartedPulling="2025-11-22 09:26:52.129516418 +0000 UTC m=+787.065206067" lastFinishedPulling="2025-11-22 09:26:58.030300138 +0000 UTC m=+792.965989787" observedRunningTime="2025-11-22 09:26:58.490133908 +0000 UTC m=+793.425823577" watchObservedRunningTime="2025-11-22 09:26:58.517529258 +0000 UTC m=+793.453218927" Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.534826 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" podStartSLOduration=1.87911663 podStartE2EDuration="7.534807642s" podCreationTimestamp="2025-11-22 09:26:51 +0000 UTC" firstStartedPulling="2025-11-22 09:26:52.454430617 +0000 UTC m=+787.390120286" lastFinishedPulling="2025-11-22 09:26:58.110121649 +0000 UTC m=+793.045811298" observedRunningTime="2025-11-22 09:26:58.531908377 +0000 UTC m=+793.467598046" watchObservedRunningTime="2025-11-22 09:26:58.534807642 +0000 UTC m=+793.470497291" Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.625549 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.625619 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.625670 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.626334 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98f9262a8d10b551be9acdbca7c91a24b8c83945ea853c86e2932b08cb27780b"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.626395 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://98f9262a8d10b551be9acdbca7c91a24b8c83945ea853c86e2932b08cb27780b" gracePeriod=600 Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.659022 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:58 crc kubenswrapper[4846]: I1122 09:26:58.706495 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:26:59 crc kubenswrapper[4846]: I1122 09:26:59.479798 4846 generic.go:334] "Generic (PLEG): container finished" 
podID="2157e9ef-592a-40b0-912c-733863d9d8df" containerID="6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf" exitCode=0 Nov 22 09:26:59 crc kubenswrapper[4846]: I1122 09:26:59.479856 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxw69" event={"ID":"2157e9ef-592a-40b0-912c-733863d9d8df","Type":"ContainerDied","Data":"6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf"} Nov 22 09:26:59 crc kubenswrapper[4846]: I1122 09:26:59.486164 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="98f9262a8d10b551be9acdbca7c91a24b8c83945ea853c86e2932b08cb27780b" exitCode=0 Nov 22 09:26:59 crc kubenswrapper[4846]: I1122 09:26:59.486303 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"98f9262a8d10b551be9acdbca7c91a24b8c83945ea853c86e2932b08cb27780b"} Nov 22 09:26:59 crc kubenswrapper[4846]: I1122 09:26:59.486375 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"bac90441ca960230e02742d36a5b95524d70b371a6ee7e32b617df01413fca78"} Nov 22 09:26:59 crc kubenswrapper[4846]: I1122 09:26:59.486399 4846 scope.go:117] "RemoveContainer" containerID="2d6e3598b04fea951b6da83a54c8c53b23be887c3070db956e053d32e85f6afe" Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.060250 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-99vqt"] Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.061207 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-99vqt" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerName="registry-server" containerID="cri-o://544d578048e4f1eb9148ccb4174b0e98f61dcd8d4564d0e95b6f12613d4ac887" gracePeriod=2 Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.507006 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxw69" event={"ID":"2157e9ef-592a-40b0-912c-733863d9d8df","Type":"ContainerStarted","Data":"3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d"} Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.511855 4846 generic.go:334] "Generic (PLEG): container finished" podID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerID="544d578048e4f1eb9148ccb4174b0e98f61dcd8d4564d0e95b6f12613d4ac887" exitCode=0 Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.511922 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99vqt" event={"ID":"dcb7aa59-21a8-4483-b4fc-56c3d7883d77","Type":"ContainerDied","Data":"544d578048e4f1eb9148ccb4174b0e98f61dcd8d4564d0e95b6f12613d4ac887"} Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.512005 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99vqt" event={"ID":"dcb7aa59-21a8-4483-b4fc-56c3d7883d77","Type":"ContainerDied","Data":"9843c90a034a66a7c4e059a8e87e51885ec8f13e7e595a4cba223b4422b6d348"} Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.512066 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9843c90a034a66a7c4e059a8e87e51885ec8f13e7e595a4cba223b4422b6d348" Nov 22 09:27:01 
crc kubenswrapper[4846]: I1122 09:27:01.528776 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gxw69" podStartSLOduration=4.227186164 podStartE2EDuration="8.528757329s" podCreationTimestamp="2025-11-22 09:26:53 +0000 UTC" firstStartedPulling="2025-11-22 09:26:56.176119798 +0000 UTC m=+791.111809447" lastFinishedPulling="2025-11-22 09:27:00.477690963 +0000 UTC m=+795.413380612" observedRunningTime="2025-11-22 09:27:01.526101002 +0000 UTC m=+796.461790651" watchObservedRunningTime="2025-11-22 09:27:01.528757329 +0000 UTC m=+796.464446988" Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.537204 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.695299 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-utilities\") pod \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.695374 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-catalog-content\") pod \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.695436 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2lls\" (UniqueName: \"kubernetes.io/projected/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-kube-api-access-m2lls\") pod \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\" (UID: \"dcb7aa59-21a8-4483-b4fc-56c3d7883d77\") " Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.696156 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-utilities" (OuterVolumeSpecName: "utilities") pod "dcb7aa59-21a8-4483-b4fc-56c3d7883d77" (UID: "dcb7aa59-21a8-4483-b4fc-56c3d7883d77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.701852 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-kube-api-access-m2lls" (OuterVolumeSpecName: "kube-api-access-m2lls") pod "dcb7aa59-21a8-4483-b4fc-56c3d7883d77" (UID: "dcb7aa59-21a8-4483-b4fc-56c3d7883d77"). InnerVolumeSpecName "kube-api-access-m2lls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.789654 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcb7aa59-21a8-4483-b4fc-56c3d7883d77" (UID: "dcb7aa59-21a8-4483-b4fc-56c3d7883d77"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.797154 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.797204 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:01 crc kubenswrapper[4846]: I1122 09:27:01.797224 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2lls\" (UniqueName: \"kubernetes.io/projected/dcb7aa59-21a8-4483-b4fc-56c3d7883d77-kube-api-access-m2lls\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:02 crc kubenswrapper[4846]: I1122 09:27:02.517213 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99vqt" Nov 22 09:27:02 crc kubenswrapper[4846]: I1122 09:27:02.543589 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-99vqt"] Nov 22 09:27:02 crc kubenswrapper[4846]: I1122 09:27:02.551000 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-99vqt"] Nov 22 09:27:03 crc kubenswrapper[4846]: I1122 09:27:03.802243 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:27:03 crc kubenswrapper[4846]: I1122 09:27:03.802351 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:27:03 crc kubenswrapper[4846]: I1122 09:27:03.862254 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:27:04 crc kubenswrapper[4846]: I1122 09:27:04.043082 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" path="/var/lib/kubelet/pods/dcb7aa59-21a8-4483-b4fc-56c3d7883d77/volumes" Nov 22 09:27:11 crc kubenswrapper[4846]: I1122 09:27:11.915947 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57ff77b6c8-sd485" Nov 22 09:27:13 crc kubenswrapper[4846]: I1122 09:27:13.872857 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:27:13 crc kubenswrapper[4846]: I1122 09:27:13.933784 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gxw69"] Nov 22 09:27:14 crc kubenswrapper[4846]: I1122 09:27:14.628538 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gxw69" podUID="2157e9ef-592a-40b0-912c-733863d9d8df" containerName="registry-server" containerID="cri-o://3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d" gracePeriod=2 Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.051824 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.190922 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4ztt\" (UniqueName: \"kubernetes.io/projected/2157e9ef-592a-40b0-912c-733863d9d8df-kube-api-access-s4ztt\") pod \"2157e9ef-592a-40b0-912c-733863d9d8df\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.191640 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-utilities\") pod \"2157e9ef-592a-40b0-912c-733863d9d8df\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.191706 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-catalog-content\") pod \"2157e9ef-592a-40b0-912c-733863d9d8df\" (UID: \"2157e9ef-592a-40b0-912c-733863d9d8df\") " Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.194283 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-utilities" (OuterVolumeSpecName: "utilities") pod "2157e9ef-592a-40b0-912c-733863d9d8df" (UID: "2157e9ef-592a-40b0-912c-733863d9d8df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.214107 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2157e9ef-592a-40b0-912c-733863d9d8df-kube-api-access-s4ztt" (OuterVolumeSpecName: "kube-api-access-s4ztt") pod "2157e9ef-592a-40b0-912c-733863d9d8df" (UID: "2157e9ef-592a-40b0-912c-733863d9d8df"). InnerVolumeSpecName "kube-api-access-s4ztt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.256241 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2157e9ef-592a-40b0-912c-733863d9d8df" (UID: "2157e9ef-592a-40b0-912c-733863d9d8df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.294111 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.294150 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e9ef-592a-40b0-912c-733863d9d8df-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.294167 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4ztt\" (UniqueName: \"kubernetes.io/projected/2157e9ef-592a-40b0-912c-733863d9d8df-kube-api-access-s4ztt\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.640280 4846 generic.go:334] "Generic (PLEG): container finished" podID="2157e9ef-592a-40b0-912c-733863d9d8df" containerID="3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d" exitCode=0 Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.640368 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxw69" event={"ID":"2157e9ef-592a-40b0-912c-733863d9d8df","Type":"ContainerDied","Data":"3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d"} Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.640453 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gxw69" event={"ID":"2157e9ef-592a-40b0-912c-733863d9d8df","Type":"ContainerDied","Data":"acb7c6f271485f3bc7d5882bbf58c095d893bfc2efefef50055f21c97d5c663d"} Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.640482 4846 scope.go:117] "RemoveContainer" containerID="3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.640399 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gxw69" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.666437 4846 scope.go:117] "RemoveContainer" containerID="6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.674100 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gxw69"] Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.679125 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gxw69"] Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.689499 4846 scope.go:117] "RemoveContainer" containerID="00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.715294 4846 scope.go:117] "RemoveContainer" containerID="3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d" Nov 22 09:27:15 crc kubenswrapper[4846]: E1122 09:27:15.715914 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d\": container with ID starting with 3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d not found: ID does not exist" containerID="3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.715957 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d"} err="failed to get container status \"3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d\": rpc error: code = NotFound desc = could not find container \"3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d\": container with ID starting with 3b7b49fe125df80654a5c29b2e9065df3b35ab762cfcb7fc998342462f97874d not found: ID does not exist" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.715986 4846 scope.go:117] "RemoveContainer" containerID="6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf" Nov 22 09:27:15 crc kubenswrapper[4846]: E1122 09:27:15.716275 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf\": container with ID starting with 6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf not found: ID does not exist" containerID="6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.716302 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf"} err="failed to get container status \"6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf\": rpc error: code = NotFound desc = could not find container \"6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf\": container with ID starting with 6ca3baed62241e64d08d5db059ec2bb7daa5454a3a711821bb86195531a17acf not found: ID does not exist" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.716319 4846 scope.go:117] "RemoveContainer" containerID="00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30" Nov 22 09:27:15 crc kubenswrapper[4846]: E1122 09:27:15.716707 4846 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30\": container with ID starting with 00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30 not found: ID does not exist" containerID="00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30" Nov 22 09:27:15 crc kubenswrapper[4846]: I1122 09:27:15.716729 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30"} err="failed to get container status \"00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30\": rpc error: code = NotFound desc = could not find container \"00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30\": container with ID starting with 00d6078eba9d331c0271d1df9d5e711ff6ee9e95b0b086e3694f0ca4724e5a30 not found: ID does not exist" Nov 22 09:27:16 crc kubenswrapper[4846]: I1122 09:27:16.047208 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2157e9ef-592a-40b0-912c-733863d9d8df" path="/var/lib/kubelet/pods/2157e9ef-592a-40b0-912c-733863d9d8df/volumes" Nov 22 09:27:31 crc kubenswrapper[4846]: I1122 09:27:31.589925 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6b9465489d-lwlfq" Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.331610 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-4btm7"] Nov 22 09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.331925 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerName="registry-server" Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.331941 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerName="registry-server" Nov 22 09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.331955 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2157e9ef-592a-40b0-912c-733863d9d8df" containerName="extract-utilities" Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.331964 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2157e9ef-592a-40b0-912c-733863d9d8df" containerName="extract-utilities" Nov 22 09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.331980 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2157e9ef-592a-40b0-912c-733863d9d8df" containerName="extract-content" Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.331990 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2157e9ef-592a-40b0-912c-733863d9d8df" containerName="extract-content" Nov 22 09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.332003 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2157e9ef-592a-40b0-912c-733863d9d8df" containerName="registry-server" Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.332011 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2157e9ef-592a-40b0-912c-733863d9d8df" containerName="registry-server" Nov 22 09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.332036 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerName="extract-utilities" Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.332061 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerName="extract-utilities" Nov 22 
09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.332076 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerName="extract-content"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.332083 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerName="extract-content"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.332209 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="2157e9ef-592a-40b0-912c-733863d9d8df" containerName="registry-server"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.332222 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb7aa59-21a8-4483-b4fc-56c3d7883d77" containerName="registry-server"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.334572 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.337868 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.337980 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6w6ph"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.339487 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.342578 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"]
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.343619 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.345391 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.362561 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"]
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.443768 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jtgbm"]
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.444786 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.447901 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.447936 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.447943 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wqrjs"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.447993 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458457 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn4qm\" (UniqueName: \"kubernetes.io/projected/93d95094-1954-4055-b057-9c94763afc6f-kube-api-access-xn4qm\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458522 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-frr-conf\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458547 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-metrics\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458575 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtvtt\" (UniqueName: \"kubernetes.io/projected/b4e18041-980a-4cbb-ba17-98b3f6032c57-kube-api-access-qtvtt\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458598 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tglxl\" (UniqueName: \"kubernetes.io/projected/ae71f435-af46-44ac-afdb-57dea9cd1925-kube-api-access-tglxl\") pod \"frr-k8s-webhook-server-6998585d5-g4nfr\" (UID: \"ae71f435-af46-44ac-afdb-57dea9cd1925\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458647 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae71f435-af46-44ac-afdb-57dea9cd1925-cert\") pod \"frr-k8s-webhook-server-6998585d5-g4nfr\" (UID: \"ae71f435-af46-44ac-afdb-57dea9cd1925\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458677 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-reloader\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458713 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4e18041-980a-4cbb-ba17-98b3f6032c57-metallb-excludel2\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458737 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-metrics-certs\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458759 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/93d95094-1954-4055-b057-9c94763afc6f-frr-startup\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458794 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93d95094-1954-4055-b057-9c94763afc6f-metrics-certs\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458814 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-frr-sockets\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.458835 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-memberlist\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.459338 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-z5dqq"]
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.461863 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.471964 4846 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.486211 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-z5dqq"]
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.559974 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93d95094-1954-4055-b057-9c94763afc6f-metrics-certs\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560065 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-frr-sockets\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560088 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-memberlist\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560117 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn4qm\" (UniqueName: \"kubernetes.io/projected/93d95094-1954-4055-b057-9c94763afc6f-kube-api-access-xn4qm\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560151 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-frr-conf\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560169 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-metrics\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560187 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtvtt\" (UniqueName: \"kubernetes.io/projected/b4e18041-980a-4cbb-ba17-98b3f6032c57-kube-api-access-qtvtt\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560205 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tglxl\" (UniqueName: \"kubernetes.io/projected/ae71f435-af46-44ac-afdb-57dea9cd1925-kube-api-access-tglxl\") pod \"frr-k8s-webhook-server-6998585d5-g4nfr\" (UID: \"ae71f435-af46-44ac-afdb-57dea9cd1925\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560258 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae71f435-af46-44ac-afdb-57dea9cd1925-cert\") pod \"frr-k8s-webhook-server-6998585d5-g4nfr\" (UID: \"ae71f435-af46-44ac-afdb-57dea9cd1925\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560282 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-reloader\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560309 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4e18041-980a-4cbb-ba17-98b3f6032c57-metallb-excludel2\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560326 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/93d95094-1954-4055-b057-9c94763afc6f-frr-startup\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560347 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-metrics-certs\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.560519 4846 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Nov 22 09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.560599 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-metrics-certs podName:b4e18041-980a-4cbb-ba17-98b3f6032c57 nodeName:}" failed. No retries permitted until 2025-11-22 09:27:33.060573924 +0000 UTC m=+827.996263573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-metrics-certs") pod "speaker-jtgbm" (UID: "b4e18041-980a-4cbb-ba17-98b3f6032c57") : secret "speaker-certs-secret" not found
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.560664 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-frr-conf\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.562316 4846 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Nov 22 09:27:32 crc kubenswrapper[4846]: E1122 09:27:32.562611 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-memberlist podName:b4e18041-980a-4cbb-ba17-98b3f6032c57 nodeName:}" failed. No retries permitted until 2025-11-22 09:27:33.062584503 +0000 UTC m=+827.998274392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-memberlist") pod "speaker-jtgbm" (UID: "b4e18041-980a-4cbb-ba17-98b3f6032c57") : secret "metallb-memberlist" not found
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.565436 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/93d95094-1954-4055-b057-9c94763afc6f-frr-startup\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.566179 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4e18041-980a-4cbb-ba17-98b3f6032c57-metallb-excludel2\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.566186 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-frr-sockets\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.566614 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-metrics\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.566685 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/93d95094-1954-4055-b057-9c94763afc6f-reloader\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.579816 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93d95094-1954-4055-b057-9c94763afc6f-metrics-certs\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.580818 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtvtt\" (UniqueName: \"kubernetes.io/projected/b4e18041-980a-4cbb-ba17-98b3f6032c57-kube-api-access-qtvtt\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.585490 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn4qm\" (UniqueName: \"kubernetes.io/projected/93d95094-1954-4055-b057-9c94763afc6f-kube-api-access-xn4qm\") pod \"frr-k8s-4btm7\" (UID: \"93d95094-1954-4055-b057-9c94763afc6f\") " pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.590694 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tglxl\" (UniqueName: \"kubernetes.io/projected/ae71f435-af46-44ac-afdb-57dea9cd1925-kube-api-access-tglxl\") pod \"frr-k8s-webhook-server-6998585d5-g4nfr\" (UID: \"ae71f435-af46-44ac-afdb-57dea9cd1925\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.590768 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae71f435-af46-44ac-afdb-57dea9cd1925-cert\") pod \"frr-k8s-webhook-server-6998585d5-g4nfr\" (UID: \"ae71f435-af46-44ac-afdb-57dea9cd1925\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.661237 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/425496ff-38a1-4d67-b702-9bb864465158-cert\") pod \"controller-6c7b4b5f48-z5dqq\" (UID: \"425496ff-38a1-4d67-b702-9bb864465158\") " pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.661316 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k8dh\" (UniqueName: \"kubernetes.io/projected/425496ff-38a1-4d67-b702-9bb864465158-kube-api-access-9k8dh\") pod \"controller-6c7b4b5f48-z5dqq\" (UID: \"425496ff-38a1-4d67-b702-9bb864465158\") " pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.661345 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425496ff-38a1-4d67-b702-9bb864465158-metrics-certs\") pod \"controller-6c7b4b5f48-z5dqq\" (UID: \"425496ff-38a1-4d67-b702-9bb864465158\") " pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.663779 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.670435 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.762807 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/425496ff-38a1-4d67-b702-9bb864465158-cert\") pod \"controller-6c7b4b5f48-z5dqq\" (UID: \"425496ff-38a1-4d67-b702-9bb864465158\") " pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.764019 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k8dh\" (UniqueName: \"kubernetes.io/projected/425496ff-38a1-4d67-b702-9bb864465158-kube-api-access-9k8dh\") pod \"controller-6c7b4b5f48-z5dqq\" (UID: \"425496ff-38a1-4d67-b702-9bb864465158\") " pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.764094 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425496ff-38a1-4d67-b702-9bb864465158-metrics-certs\") pod \"controller-6c7b4b5f48-z5dqq\" (UID: \"425496ff-38a1-4d67-b702-9bb864465158\") " pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.768991 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/425496ff-38a1-4d67-b702-9bb864465158-metrics-certs\") pod \"controller-6c7b4b5f48-z5dqq\" (UID: \"425496ff-38a1-4d67-b702-9bb864465158\") " pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.772532 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/425496ff-38a1-4d67-b702-9bb864465158-cert\") pod \"controller-6c7b4b5f48-z5dqq\" (UID: \"425496ff-38a1-4d67-b702-9bb864465158\") " pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.782852 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k8dh\" (UniqueName: \"kubernetes.io/projected/425496ff-38a1-4d67-b702-9bb864465158-kube-api-access-9k8dh\") pod \"controller-6c7b4b5f48-z5dqq\" (UID: \"425496ff-38a1-4d67-b702-9bb864465158\") " pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:32 crc kubenswrapper[4846]: I1122 09:27:32.807770 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.073016 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-metrics-certs\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.073595 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-memberlist\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:33 crc kubenswrapper[4846]: E1122 09:27:33.073767 4846 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Nov 22 09:27:33 crc kubenswrapper[4846]: E1122 09:27:33.073865 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-memberlist podName:b4e18041-980a-4cbb-ba17-98b3f6032c57 nodeName:}" failed. No retries permitted until 2025-11-22 09:27:34.073842074 +0000 UTC m=+829.009531723 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-memberlist") pod "speaker-jtgbm" (UID: "b4e18041-980a-4cbb-ba17-98b3f6032c57") : secret "metallb-memberlist" not found
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.078578 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-metrics-certs\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.172574 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"]
Nov 22 09:27:33 crc kubenswrapper[4846]: W1122 09:27:33.176620 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae71f435_af46_44ac_afdb_57dea9cd1925.slice/crio-678fb5b91e8808aa2f8e56c5a61a8d802a88a90e6f1fc715b0810b80fa4e4d23 WatchSource:0}: Error finding container 678fb5b91e8808aa2f8e56c5a61a8d802a88a90e6f1fc715b0810b80fa4e4d23: Status 404 returned error can't find the container with id 678fb5b91e8808aa2f8e56c5a61a8d802a88a90e6f1fc715b0810b80fa4e4d23
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.226503 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-z5dqq"]
Nov 22 09:27:33 crc kubenswrapper[4846]: W1122 09:27:33.231544 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod425496ff_38a1_4d67_b702_9bb864465158.slice/crio-dc883d75f02dc02b14af8b5c1d52a9f706d2a0b1241680a3265cc13ebaeb3b1d WatchSource:0}: Error finding container dc883d75f02dc02b14af8b5c1d52a9f706d2a0b1241680a3265cc13ebaeb3b1d: Status 404 returned error can't find the container with id dc883d75f02dc02b14af8b5c1d52a9f706d2a0b1241680a3265cc13ebaeb3b1d
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.766555 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr" event={"ID":"ae71f435-af46-44ac-afdb-57dea9cd1925","Type":"ContainerStarted","Data":"678fb5b91e8808aa2f8e56c5a61a8d802a88a90e6f1fc715b0810b80fa4e4d23"}
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.770001 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-z5dqq" event={"ID":"425496ff-38a1-4d67-b702-9bb864465158","Type":"ContainerStarted","Data":"64c33ce571881f16b7479aae8c7d3ac07a3f418847504f7700f666c3b588c629"}
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.770138 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-z5dqq" event={"ID":"425496ff-38a1-4d67-b702-9bb864465158","Type":"ContainerStarted","Data":"edb8165879d31b66f0348a3e7249b5fba75bcffaec71741dc3026a228554c283"}
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.770157 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-z5dqq" event={"ID":"425496ff-38a1-4d67-b702-9bb864465158","Type":"ContainerStarted","Data":"dc883d75f02dc02b14af8b5c1d52a9f706d2a0b1241680a3265cc13ebaeb3b1d"}
Nov 22 09:27:33 crc kubenswrapper[4846]: I1122 09:27:33.772035 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerStarted","Data":"c6289ab8764e53c3a0637c859741cd608a0d21fb048f7b84df39fb784f0f0f1b"}
Nov 22 09:27:34 crc kubenswrapper[4846]: I1122 09:27:34.086136 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-memberlist\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:34 crc kubenswrapper[4846]: I1122 09:27:34.096114 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4e18041-980a-4cbb-ba17-98b3f6032c57-memberlist\") pod \"speaker-jtgbm\" (UID: \"b4e18041-980a-4cbb-ba17-98b3f6032c57\") " pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:34 crc kubenswrapper[4846]: I1122 09:27:34.270430 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:34 crc kubenswrapper[4846]: I1122 09:27:34.793501 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jtgbm" event={"ID":"b4e18041-980a-4cbb-ba17-98b3f6032c57","Type":"ContainerStarted","Data":"68cd18209096c4d0a5b367b143db6d6ea575fb7614a994e2174b8dd3cf0bd400"}
Nov 22 09:27:34 crc kubenswrapper[4846]: I1122 09:27:34.793957 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jtgbm" event={"ID":"b4e18041-980a-4cbb-ba17-98b3f6032c57","Type":"ContainerStarted","Data":"8cdff693c9b84e59e5e7812a56cf22d2ed856d650c5286155c563b01d5c2ebd7"}
Nov 22 09:27:34 crc kubenswrapper[4846]: I1122 09:27:34.793987 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:34 crc kubenswrapper[4846]: I1122 09:27:34.998566 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-z5dqq" podStartSLOduration=2.998545785 podStartE2EDuration="2.998545785s" podCreationTimestamp="2025-11-22 09:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:27:33.796282463 +0000 UTC m=+828.731972152" watchObservedRunningTime="2025-11-22 09:27:34.998545785 +0000 UTC m=+829.934235434"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.000338 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t5mlg"]
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.001519 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.018889 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5mlg"]
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.202491 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-catalog-content\") pod \"redhat-marketplace-t5mlg\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") " pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.202733 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-utilities\") pod \"redhat-marketplace-t5mlg\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") " pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.202933 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g75gt\" (UniqueName: \"kubernetes.io/projected/18ca1b11-29c0-4b0a-9095-55833df48d20-kube-api-access-g75gt\") pod \"redhat-marketplace-t5mlg\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") " pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.304168 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-catalog-content\") pod \"redhat-marketplace-t5mlg\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") " pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.304247 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-utilities\") pod \"redhat-marketplace-t5mlg\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") " pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.304313 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g75gt\" (UniqueName: \"kubernetes.io/projected/18ca1b11-29c0-4b0a-9095-55833df48d20-kube-api-access-g75gt\") pod \"redhat-marketplace-t5mlg\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") " pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.304829 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-catalog-content\") pod \"redhat-marketplace-t5mlg\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") " pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.304917 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-utilities\") pod \"redhat-marketplace-t5mlg\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") " pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.326674 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g75gt\" (UniqueName: \"kubernetes.io/projected/18ca1b11-29c0-4b0a-9095-55833df48d20-kube-api-access-g75gt\") pod \"redhat-marketplace-t5mlg\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") " pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.617468 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.818511 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jtgbm" event={"ID":"b4e18041-980a-4cbb-ba17-98b3f6032c57","Type":"ContainerStarted","Data":"e2e1803d2ca29e26835d6922d08141abcef3b73cae6296afdb7819fd3b98b4ec"}
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.819303 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:35 crc kubenswrapper[4846]: I1122 09:27:35.843653 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jtgbm" podStartSLOduration=3.843629966 podStartE2EDuration="3.843629966s" podCreationTimestamp="2025-11-22 09:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:27:35.8389733 +0000 UTC m=+830.774662959" watchObservedRunningTime="2025-11-22 09:27:35.843629966 +0000 UTC m=+830.779319615"
Nov 22 09:27:36 crc kubenswrapper[4846]: I1122 09:27:36.249181 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5mlg"]
Nov 22 09:27:36 crc kubenswrapper[4846]: I1122 09:27:36.834475 4846 generic.go:334] "Generic (PLEG): container finished" podID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerID="0f7ce78c2e6188edc16e32859f892dd53144f35a7137559d464cd67501ac0587" exitCode=0
Nov 22 09:27:36 crc kubenswrapper[4846]: I1122 09:27:36.835668 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5mlg" event={"ID":"18ca1b11-29c0-4b0a-9095-55833df48d20","Type":"ContainerDied","Data":"0f7ce78c2e6188edc16e32859f892dd53144f35a7137559d464cd67501ac0587"}
Nov 22 09:27:36 crc kubenswrapper[4846]: I1122 09:27:36.835766 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5mlg" event={"ID":"18ca1b11-29c0-4b0a-9095-55833df48d20","Type":"ContainerStarted","Data":"da546ae0032436964dfe1d70f73295a1a1dc441360df9df91a92443589d022fb"}
Nov 22 09:27:37 crc kubenswrapper[4846]: I1122 09:27:37.854355 4846 generic.go:334] "Generic (PLEG): container finished" podID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerID="156f0f5edf3d766d73b055b2c20e33e8adce99e00e7ad475da6b7d6f8e025d7f" exitCode=0
Nov 22 09:27:37 crc kubenswrapper[4846]: I1122 09:27:37.855007 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5mlg" event={"ID":"18ca1b11-29c0-4b0a-9095-55833df48d20","Type":"ContainerDied","Data":"156f0f5edf3d766d73b055b2c20e33e8adce99e00e7ad475da6b7d6f8e025d7f"}
Nov 22 09:27:40 crc kubenswrapper[4846]: I1122 09:27:40.889132 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5mlg" event={"ID":"18ca1b11-29c0-4b0a-9095-55833df48d20","Type":"ContainerStarted","Data":"1e16e3ffdaa9fdc39caa0310c66be48cae72c445acb460bf725e504913eb8237"}
Nov 22 09:27:40 crc kubenswrapper[4846]: I1122 09:27:40.891109 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr" event={"ID":"ae71f435-af46-44ac-afdb-57dea9cd1925","Type":"ContainerStarted","Data":"9fab8c86afc65b3534a8d72930af6459f3438cc2a12dea009ef983f77a372159"}
Nov 22 09:27:40 crc kubenswrapper[4846]: I1122 09:27:40.891215 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:40 crc kubenswrapper[4846]: I1122 09:27:40.892791 4846 generic.go:334] "Generic (PLEG): container finished" podID="93d95094-1954-4055-b057-9c94763afc6f" containerID="908dbf0c85dbeb92edccb6376769c03dff87890297bc4c43e0a050f4a64d709e" exitCode=0
Nov 22 09:27:40 crc kubenswrapper[4846]: I1122 09:27:40.892835 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerDied","Data":"908dbf0c85dbeb92edccb6376769c03dff87890297bc4c43e0a050f4a64d709e"}
Nov 22 09:27:40 crc kubenswrapper[4846]: I1122 09:27:40.918249 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t5mlg" podStartSLOduration=3.2212675 podStartE2EDuration="6.918229858s" podCreationTimestamp="2025-11-22 09:27:34 +0000 UTC" firstStartedPulling="2025-11-22 09:27:36.837834401 +0000 UTC m=+831.773524060" lastFinishedPulling="2025-11-22 09:27:40.534796759 +0000 UTC m=+835.470486418" observedRunningTime="2025-11-22 09:27:40.916120536 +0000 UTC m=+835.851810185" watchObservedRunningTime="2025-11-22 09:27:40.918229858 +0000 UTC m=+835.853919507"
Nov 22 09:27:40 crc kubenswrapper[4846]: I1122 09:27:40.972412 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr" podStartSLOduration=1.577492453 podStartE2EDuration="8.972388309s" podCreationTimestamp="2025-11-22 09:27:32 +0000 UTC" firstStartedPulling="2025-11-22 09:27:33.17948921 +0000 UTC m=+828.115178859" lastFinishedPulling="2025-11-22 09:27:40.574385066 +0000 UTC m=+835.510074715" observedRunningTime="2025-11-22 09:27:40.970171785 +0000 UTC m=+835.905861444" watchObservedRunningTime="2025-11-22 09:27:40.972388309 +0000 UTC m=+835.908077958"
Nov 22 09:27:41 crc kubenswrapper[4846]: I1122 09:27:41.901909 4846 generic.go:334] "Generic (PLEG): container finished" podID="93d95094-1954-4055-b057-9c94763afc6f" containerID="b42439f149aeb5f59f681748a7f10812d60c76bbe3a8146500df963c707cd3e9" exitCode=0
Nov 22 09:27:41 crc kubenswrapper[4846]: I1122 09:27:41.902146 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerDied","Data":"b42439f149aeb5f59f681748a7f10812d60c76bbe3a8146500df963c707cd3e9"}
Nov 22 09:27:42 crc kubenswrapper[4846]: I1122 09:27:42.917923 4846 generic.go:334] "Generic (PLEG): container finished" podID="93d95094-1954-4055-b057-9c94763afc6f" containerID="27368cbc4e6fa1f4c56ccc915c5a0b765660c8651e137f7f66b0125bc73e6b75" exitCode=0
Nov 22 09:27:42 crc kubenswrapper[4846]: I1122 09:27:42.917997 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerDied","Data":"27368cbc4e6fa1f4c56ccc915c5a0b765660c8651e137f7f66b0125bc73e6b75"}
Nov 22 09:27:43 crc kubenswrapper[4846]: I1122 09:27:43.932117 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerStarted","Data":"e567155fb705f03477ca31be0b0ad5203137027dd5518663d7c45e31dc5984b7"}
Nov 22 09:27:43 crc kubenswrapper[4846]: I1122 09:27:43.932933 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerStarted","Data":"d9c1ec3d43636d4a2b3bd71abb8815d51e88b084292e8b8a5566883915a682b9"}
Nov 22 09:27:43 crc kubenswrapper[4846]: I1122 09:27:43.932951 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerStarted","Data":"b8aed0dce0b552113afed773f3cac9b59a174c3f2af13f4e6b4d2557ec979a70"}
Nov 22 09:27:43 crc kubenswrapper[4846]: I1122 09:27:43.932963 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerStarted","Data":"e0f379e05e52b3a6ef9541813e3f02076f56eb57cbfafb37edeb53f8bcaad1e8"}
Nov 22 09:27:43 crc kubenswrapper[4846]: I1122 09:27:43.932972 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerStarted","Data":"561d249c7c6ce3336ee3102a907163ef442ebf5061d08995d76084908e18b49a"}
Nov 22 09:27:44 crc kubenswrapper[4846]: I1122 09:27:44.275587 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jtgbm"
Nov 22 09:27:44 crc kubenswrapper[4846]: I1122 09:27:44.946176 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4btm7" event={"ID":"93d95094-1954-4055-b057-9c94763afc6f","Type":"ContainerStarted","Data":"528a622930ed38c4123c9ef6fa5587cfdc191e6c69c9e6de6e9d1ac32cad9fdb"}
Nov 22 09:27:44 crc kubenswrapper[4846]: I1122 09:27:44.947502 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:44 crc kubenswrapper[4846]: I1122 09:27:44.982159 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-4btm7" podStartSLOduration=5.285792173 podStartE2EDuration="12.982131873s" podCreationTimestamp="2025-11-22 09:27:32 +0000 UTC" firstStartedPulling="2025-11-22 09:27:32.861234325 +0000 UTC m=+827.796923974" lastFinishedPulling="2025-11-22 09:27:40.557574025 +0000 UTC m=+835.493263674" observedRunningTime="2025-11-22 09:27:44.980183536 +0000 UTC m=+839.915873275" watchObservedRunningTime="2025-11-22 09:27:44.982131873 +0000 UTC m=+839.917821532"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.179815 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2l8qs"]
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.180982 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.196202 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l8qs"]
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.262890 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-utilities\") pod \"certified-operators-2l8qs\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") " pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.262952 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-catalog-content\") pod \"certified-operators-2l8qs\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") " pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.262990 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p6fj\" (UniqueName: \"kubernetes.io/projected/6e455fe0-cf47-4202-a76e-08b4ddc90606-kube-api-access-6p6fj\") pod \"certified-operators-2l8qs\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") " pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.364481 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-utilities\") pod \"certified-operators-2l8qs\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") " pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.364557 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-catalog-content\") pod \"certified-operators-2l8qs\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") " pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.364614 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p6fj\" (UniqueName: \"kubernetes.io/projected/6e455fe0-cf47-4202-a76e-08b4ddc90606-kube-api-access-6p6fj\") pod \"certified-operators-2l8qs\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") " pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.365245 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-utilities\") pod \"certified-operators-2l8qs\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") " pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.365432 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-catalog-content\") pod \"certified-operators-2l8qs\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") " pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.398985 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p6fj\" (UniqueName: \"kubernetes.io/projected/6e455fe0-cf47-4202-a76e-08b4ddc90606-kube-api-access-6p6fj\") pod \"certified-operators-2l8qs\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") " pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.531818 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.617846 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.618162 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:45 crc kubenswrapper[4846]: I1122 09:27:45.676973 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:46 crc kubenswrapper[4846]: I1122 09:27:46.005239 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:46 crc kubenswrapper[4846]: I1122 09:27:46.181527 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l8qs"]
Nov 22 09:27:46 crc kubenswrapper[4846]: W1122 09:27:46.189060 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e455fe0_cf47_4202_a76e_08b4ddc90606.slice/crio-1a0286ace8f761f15ffb155786753fbcaefd9d7ddce979346cc0c58d8060082e WatchSource:0}: Error finding container 1a0286ace8f761f15ffb155786753fbcaefd9d7ddce979346cc0c58d8060082e: Status 404 returned error can't find the container with id 1a0286ace8f761f15ffb155786753fbcaefd9d7ddce979346cc0c58d8060082e
Nov 22 09:27:46 crc kubenswrapper[4846]: I1122 09:27:46.966359 4846 generic.go:334] "Generic (PLEG): container finished" podID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerID="4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6" exitCode=0
Nov 22 09:27:46 crc kubenswrapper[4846]: I1122 09:27:46.966494 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l8qs" event={"ID":"6e455fe0-cf47-4202-a76e-08b4ddc90606","Type":"ContainerDied","Data":"4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6"}
Nov 22 09:27:46 crc kubenswrapper[4846]: I1122 09:27:46.967032 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l8qs" event={"ID":"6e455fe0-cf47-4202-a76e-08b4ddc90606","Type":"ContainerStarted","Data":"1a0286ace8f761f15ffb155786753fbcaefd9d7ddce979346cc0c58d8060082e"}
Nov 22 09:27:47 crc kubenswrapper[4846]: I1122 09:27:47.665233 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:47 crc kubenswrapper[4846]: I1122 09:27:47.711426 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-4btm7"
Nov 22 09:27:47 crc kubenswrapper[4846]: I1122 09:27:47.978133 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l8qs" event={"ID":"6e455fe0-cf47-4202-a76e-08b4ddc90606","Type":"ContainerStarted","Data":"45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b"}
Nov 22 09:27:48 crc kubenswrapper[4846]: I1122 09:27:48.147942 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5mlg"]
Nov 22 09:27:48 crc kubenswrapper[4846]: I1122 09:27:48.148218 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t5mlg" podUID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerName="registry-server" containerID="cri-o://1e16e3ffdaa9fdc39caa0310c66be48cae72c445acb460bf725e504913eb8237" gracePeriod=2
Nov 22 09:27:48 crc kubenswrapper[4846]: I1122 09:27:48.989509 4846 generic.go:334] "Generic (PLEG): container finished" podID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerID="1e16e3ffdaa9fdc39caa0310c66be48cae72c445acb460bf725e504913eb8237" exitCode=0
Nov 22 09:27:48 crc kubenswrapper[4846]: I1122 09:27:48.989599 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5mlg" event={"ID":"18ca1b11-29c0-4b0a-9095-55833df48d20","Type":"ContainerDied","Data":"1e16e3ffdaa9fdc39caa0310c66be48cae72c445acb460bf725e504913eb8237"}
Nov 22 09:27:48 crc kubenswrapper[4846]: I1122 09:27:48.992273 4846 generic.go:334] "Generic (PLEG): container finished" podID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerID="45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b" exitCode=0
Nov 22 09:27:48 crc kubenswrapper[4846]: I1122 09:27:48.992342 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l8qs" event={"ID":"6e455fe0-cf47-4202-a76e-08b4ddc90606","Type":"ContainerDied","Data":"45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b"}
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.403397 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.530092 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g75gt\" (UniqueName: \"kubernetes.io/projected/18ca1b11-29c0-4b0a-9095-55833df48d20-kube-api-access-g75gt\") pod \"18ca1b11-29c0-4b0a-9095-55833df48d20\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") "
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.530167 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-catalog-content\") pod \"18ca1b11-29c0-4b0a-9095-55833df48d20\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") "
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.530232 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-utilities\") pod \"18ca1b11-29c0-4b0a-9095-55833df48d20\" (UID: \"18ca1b11-29c0-4b0a-9095-55833df48d20\") "
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.532088 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-utilities" (OuterVolumeSpecName: "utilities") pod "18ca1b11-29c0-4b0a-9095-55833df48d20" (UID: "18ca1b11-29c0-4b0a-9095-55833df48d20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.537229 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ca1b11-29c0-4b0a-9095-55833df48d20-kube-api-access-g75gt" (OuterVolumeSpecName: "kube-api-access-g75gt") pod "18ca1b11-29c0-4b0a-9095-55833df48d20" (UID: "18ca1b11-29c0-4b0a-9095-55833df48d20"). InnerVolumeSpecName "kube-api-access-g75gt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.550111 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18ca1b11-29c0-4b0a-9095-55833df48d20" (UID: "18ca1b11-29c0-4b0a-9095-55833df48d20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.632565 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.632602 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g75gt\" (UniqueName: \"kubernetes.io/projected/18ca1b11-29c0-4b0a-9095-55833df48d20-kube-api-access-g75gt\") on node \"crc\" DevicePath \"\""
Nov 22 09:27:49 crc kubenswrapper[4846]: I1122 09:27:49.632616 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ca1b11-29c0-4b0a-9095-55833df48d20-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 09:27:50 crc kubenswrapper[4846]: I1122 09:27:50.006711 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5mlg" event={"ID":"18ca1b11-29c0-4b0a-9095-55833df48d20","Type":"ContainerDied","Data":"da546ae0032436964dfe1d70f73295a1a1dc441360df9df91a92443589d022fb"}
Nov 22 09:27:50 crc kubenswrapper[4846]: I1122 09:27:50.006772 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5mlg"
Nov 22 09:27:50 crc kubenswrapper[4846]: I1122 09:27:50.007231 4846 scope.go:117] "RemoveContainer" containerID="1e16e3ffdaa9fdc39caa0310c66be48cae72c445acb460bf725e504913eb8237"
Nov 22 09:27:50 crc kubenswrapper[4846]: I1122 09:27:50.009869 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l8qs" event={"ID":"6e455fe0-cf47-4202-a76e-08b4ddc90606","Type":"ContainerStarted","Data":"977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4"}
Nov 22 09:27:50 crc kubenswrapper[4846]: I1122 09:27:50.033627 4846 scope.go:117] "RemoveContainer" containerID="156f0f5edf3d766d73b055b2c20e33e8adce99e00e7ad475da6b7d6f8e025d7f"
Nov 22 09:27:50 crc kubenswrapper[4846]: I1122 09:27:50.043075 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2l8qs" podStartSLOduration=2.482362424 podStartE2EDuration="5.043018666s" podCreationTimestamp="2025-11-22 09:27:45 +0000 UTC" firstStartedPulling="2025-11-22 09:27:46.970643506 +0000 UTC m=+841.906333165" lastFinishedPulling="2025-11-22 09:27:49.531299728 +0000 UTC m=+844.466989407" observedRunningTime="2025-11-22 09:27:50.039112083 +0000 UTC m=+844.974801742" watchObservedRunningTime="2025-11-22 09:27:50.043018666 +0000 UTC m=+844.978708355"
Nov 22 09:27:50 crc kubenswrapper[4846]: I1122 09:27:50.062094 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5mlg"]
Nov 22 09:27:50 crc kubenswrapper[4846]: I1122 09:27:50.067456 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5mlg"]
Nov 22 09:27:50 crc kubenswrapper[4846]: I1122 09:27:50.076607 4846 scope.go:117] "RemoveContainer" containerID="0f7ce78c2e6188edc16e32859f892dd53144f35a7137559d464cd67501ac0587"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.046555 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ca1b11-29c0-4b0a-9095-55833df48d20" path="/var/lib/kubelet/pods/18ca1b11-29c0-4b0a-9095-55833df48d20/volumes"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.165171 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bh6tp"]
Nov 22 09:27:52 crc kubenswrapper[4846]: E1122 09:27:52.165625 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerName="extract-utilities"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.165666 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerName="extract-utilities"
Nov 22 09:27:52 crc kubenswrapper[4846]: E1122 09:27:52.165697 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerName="registry-server"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.165714 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerName="registry-server"
Nov 22 09:27:52 crc kubenswrapper[4846]: E1122 09:27:52.165755 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerName="extract-content"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.165772 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerName="extract-content"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.166022 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ca1b11-29c0-4b0a-9095-55833df48d20" containerName="registry-server"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.167096 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bh6tp"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.170533 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.170810 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.172960 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bh6tp"]
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.175025 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cjnjb"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.275763 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs486\" (UniqueName: \"kubernetes.io/projected/562d6113-df0b-4993-b7b8-1cace4f13fe0-kube-api-access-xs486\") pod \"openstack-operator-index-bh6tp\" (UID: \"562d6113-df0b-4993-b7b8-1cace4f13fe0\") " pod="openstack-operators/openstack-operator-index-bh6tp"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.377799 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs486\" (UniqueName: \"kubernetes.io/projected/562d6113-df0b-4993-b7b8-1cace4f13fe0-kube-api-access-xs486\") pod \"openstack-operator-index-bh6tp\" (UID: \"562d6113-df0b-4993-b7b8-1cace4f13fe0\") " pod="openstack-operators/openstack-operator-index-bh6tp"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.402637 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs486\" (UniqueName: \"kubernetes.io/projected/562d6113-df0b-4993-b7b8-1cace4f13fe0-kube-api-access-xs486\") pod \"openstack-operator-index-bh6tp\" (UID: \"562d6113-df0b-4993-b7b8-1cace4f13fe0\") " pod="openstack-operators/openstack-operator-index-bh6tp"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.544821 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bh6tp"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.679420 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-g4nfr"
Nov 22 09:27:52 crc kubenswrapper[4846]: I1122 09:27:52.812675 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-z5dqq"
Nov 22 09:27:53 crc kubenswrapper[4846]: I1122 09:27:53.060924 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bh6tp"]
Nov 22 09:27:53 crc kubenswrapper[4846]: W1122 09:27:53.064469 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod562d6113_df0b_4993_b7b8_1cace4f13fe0.slice/crio-02455f0b50c6b38c2d6e09bc518ee13b1198b01ca6c1d3904b4df107ade555d1 WatchSource:0}: Error finding container 02455f0b50c6b38c2d6e09bc518ee13b1198b01ca6c1d3904b4df107ade555d1: Status 404 returned error can't find the container with id 02455f0b50c6b38c2d6e09bc518ee13b1198b01ca6c1d3904b4df107ade555d1
Nov 22 09:27:54 crc kubenswrapper[4846]: I1122 09:27:54.047446 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bh6tp" event={"ID":"562d6113-df0b-4993-b7b8-1cace4f13fe0","Type":"ContainerStarted","Data":"02455f0b50c6b38c2d6e09bc518ee13b1198b01ca6c1d3904b4df107ade555d1"}
Nov 22 09:27:55 crc kubenswrapper[4846]: I1122 09:27:55.532619 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:55 crc kubenswrapper[4846]: I1122 09:27:55.532739 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:55 crc kubenswrapper[4846]: I1122 09:27:55.593548 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:56 crc kubenswrapper[4846]: I1122 09:27:56.126914 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:57 crc kubenswrapper[4846]: I1122 09:27:57.080719 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bh6tp" event={"ID":"562d6113-df0b-4993-b7b8-1cace4f13fe0","Type":"ContainerStarted","Data":"da4036a5d87f9fd89cfc199961b726cc5b40b9ee9780adfb64e56d5e244cd7ef"}
Nov 22 09:27:57 crc kubenswrapper[4846]: I1122 09:27:57.116017 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bh6tp" podStartSLOduration=2.158212935 podStartE2EDuration="5.115938359s" podCreationTimestamp="2025-11-22 09:27:52 +0000 UTC" firstStartedPulling="2025-11-22 09:27:53.067072605 +0000 UTC m=+848.002762254" lastFinishedPulling="2025-11-22 09:27:56.024798029 +0000 UTC m=+850.960487678" observedRunningTime="2025-11-22 09:27:57.105998139 +0000 UTC m=+852.041687818" watchObservedRunningTime="2025-11-22 09:27:57.115938359 +0000 UTC m=+852.051628038"
Nov 22 09:27:57 crc kubenswrapper[4846]: I1122 09:27:57.157091 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2l8qs"]
Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.095373 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2l8qs" podUID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerName="registry-server" containerID="cri-o://977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4" gracePeriod=2
Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.581720 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l8qs"
Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.706364 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-utilities\") pod \"6e455fe0-cf47-4202-a76e-08b4ddc90606\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") "
Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.706437 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p6fj\" (UniqueName: \"kubernetes.io/projected/6e455fe0-cf47-4202-a76e-08b4ddc90606-kube-api-access-6p6fj\") pod \"6e455fe0-cf47-4202-a76e-08b4ddc90606\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") "
Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.706521 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-catalog-content\") pod \"6e455fe0-cf47-4202-a76e-08b4ddc90606\" (UID: \"6e455fe0-cf47-4202-a76e-08b4ddc90606\") "
Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.707561 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-utilities" (OuterVolumeSpecName: "utilities") pod "6e455fe0-cf47-4202-a76e-08b4ddc90606" (UID: "6e455fe0-cf47-4202-a76e-08b4ddc90606"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.715456 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e455fe0-cf47-4202-a76e-08b4ddc90606-kube-api-access-6p6fj" (OuterVolumeSpecName: "kube-api-access-6p6fj") pod "6e455fe0-cf47-4202-a76e-08b4ddc90606" (UID: "6e455fe0-cf47-4202-a76e-08b4ddc90606"). InnerVolumeSpecName "kube-api-access-6p6fj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.750303 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e455fe0-cf47-4202-a76e-08b4ddc90606" (UID: "6e455fe0-cf47-4202-a76e-08b4ddc90606"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.808060 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.808118 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p6fj\" (UniqueName: \"kubernetes.io/projected/6e455fe0-cf47-4202-a76e-08b4ddc90606-kube-api-access-6p6fj\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:59 crc kubenswrapper[4846]: I1122 09:27:59.808146 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e455fe0-cf47-4202-a76e-08b4ddc90606-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.110037 4846 generic.go:334] "Generic (PLEG): container finished" podID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerID="977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4" exitCode=0 Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.110161 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l8qs" event={"ID":"6e455fe0-cf47-4202-a76e-08b4ddc90606","Type":"ContainerDied","Data":"977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4"} Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.110536 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l8qs" event={"ID":"6e455fe0-cf47-4202-a76e-08b4ddc90606","Type":"ContainerDied","Data":"1a0286ace8f761f15ffb155786753fbcaefd9d7ddce979346cc0c58d8060082e"} Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.110564 4846 scope.go:117] "RemoveContainer" containerID="977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.110174 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2l8qs" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.142267 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2l8qs"] Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.146589 4846 scope.go:117] "RemoveContainer" containerID="45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.147852 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2l8qs"] Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.174966 4846 scope.go:117] "RemoveContainer" containerID="4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.199587 4846 scope.go:117] "RemoveContainer" containerID="977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4" Nov 22 09:28:00 crc kubenswrapper[4846]: E1122 09:28:00.200131 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4\": container with ID starting with 977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4 not found: ID does not exist" containerID="977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.200186 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4"} err="failed to get container status \"977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4\": rpc error: code = NotFound desc = could not find container \"977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4\": container with ID starting with 977847831cbd69ff87b80227a3f980986c76d5d99075a9324a06a0c07030afd4 not found: ID does not exist" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.200230 4846 scope.go:117] "RemoveContainer" containerID="45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b" Nov 22 09:28:00 crc kubenswrapper[4846]: E1122 09:28:00.200644 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b\": container with ID starting with 45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b not found: ID does not exist" containerID="45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.200682 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b"} err="failed to get container status \"45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b\": rpc error: code = NotFound desc = could not find container \"45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b\": container with ID starting with 45ae3f9f6e25a4022f8f597a748d36521e44fdc62bedb085773fff1671ea2d5b not found: ID does not exist" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.200710 4846 scope.go:117] "RemoveContainer" containerID="4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6" Nov 22 09:28:00 crc kubenswrapper[4846]: E1122 09:28:00.201022 4846 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6\": container with ID starting with 4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6 not found: ID does not exist" containerID="4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6" Nov 22 09:28:00 crc kubenswrapper[4846]: I1122 09:28:00.201062 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6"} err="failed to get container status \"4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6\": rpc error: code = NotFound desc = could not find container \"4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6\": container with ID starting with 4cc0e092bd44bbe039cba0a1db9ab5e7c1352524b430fdd72c22c49ac18135c6 not found: ID does not exist" Nov 22 09:28:02 crc kubenswrapper[4846]: I1122 09:28:02.049210 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e455fe0-cf47-4202-a76e-08b4ddc90606" path="/var/lib/kubelet/pods/6e455fe0-cf47-4202-a76e-08b4ddc90606/volumes" Nov 22 09:28:02 crc kubenswrapper[4846]: I1122 09:28:02.545719 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-bh6tp" Nov 22 09:28:02 crc kubenswrapper[4846]: I1122 09:28:02.546441 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-bh6tp" Nov 22 09:28:02 crc kubenswrapper[4846]: I1122 09:28:02.578313 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-bh6tp" Nov 22 09:28:02 crc kubenswrapper[4846]: I1122 09:28:02.669350 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-4btm7" Nov 22 09:28:03 crc kubenswrapper[4846]: I1122 09:28:03.172700 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-bh6tp" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.004343 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x"] Nov 22 09:28:05 crc kubenswrapper[4846]: E1122 09:28:05.004624 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerName="registry-server" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.004638 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerName="registry-server" Nov 22 09:28:05 crc kubenswrapper[4846]: E1122 09:28:05.004659 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerName="extract-content" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.004668 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerName="extract-content" Nov 22 09:28:05 crc kubenswrapper[4846]: E1122 09:28:05.004686 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerName="extract-utilities" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.004693 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerName="extract-utilities" Nov 22 09:28:05 
crc kubenswrapper[4846]: I1122 09:28:05.004828 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e455fe0-cf47-4202-a76e-08b4ddc90606" containerName="registry-server" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.005773 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.007567 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vcpwn" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.022556 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x"] Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.195130 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-bundle\") pod \"473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.195554 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5m5d\" (UniqueName: \"kubernetes.io/projected/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-kube-api-access-j5m5d\") pod \"473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.195636 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-util\") pod \"473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.296945 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-util\") pod \"473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.297085 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-bundle\") pod \"473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.297128 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5m5d\" (UniqueName: \"kubernetes.io/projected/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-kube-api-access-j5m5d\") pod \"473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " 
pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.297584 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-util\") pod \"473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.298117 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-bundle\") pod \"473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.336341 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5m5d\" (UniqueName: \"kubernetes.io/projected/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-kube-api-access-j5m5d\") pod \"473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:05 crc kubenswrapper[4846]: I1122 09:28:05.623502 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:06 crc kubenswrapper[4846]: I1122 09:28:06.076882 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x"] Nov 22 09:28:06 crc kubenswrapper[4846]: I1122 09:28:06.162578 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" event={"ID":"be2f7b8e-cdfc-4405-a4a7-9d835a12da05","Type":"ContainerStarted","Data":"6751aa3945c8d2bd89169abb6c0648727afd95a6e274807f3cd7470d0ccc9c0f"} Nov 22 09:28:07 crc kubenswrapper[4846]: I1122 09:28:07.170767 4846 generic.go:334] "Generic (PLEG): container finished" podID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerID="2ae1a7e4afdf8cf6a9062625bc422fa3c50b3e8e7870f1929fab2a7e4f51740f" exitCode=0 Nov 22 09:28:07 crc kubenswrapper[4846]: I1122 09:28:07.170826 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" event={"ID":"be2f7b8e-cdfc-4405-a4a7-9d835a12da05","Type":"ContainerDied","Data":"2ae1a7e4afdf8cf6a9062625bc422fa3c50b3e8e7870f1929fab2a7e4f51740f"} Nov 22 09:28:08 crc kubenswrapper[4846]: I1122 09:28:08.179663 4846 generic.go:334] "Generic (PLEG): container finished" podID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerID="6d78e974d6396b4d19c48ba022cac1baa7005289ad57bcf8e370ec5134e1c3bd" exitCode=0 Nov 22 09:28:08 crc kubenswrapper[4846]: I1122 09:28:08.179778 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" event={"ID":"be2f7b8e-cdfc-4405-a4a7-9d835a12da05","Type":"ContainerDied","Data":"6d78e974d6396b4d19c48ba022cac1baa7005289ad57bcf8e370ec5134e1c3bd"} Nov 22 09:28:09 crc kubenswrapper[4846]: I1122 09:28:09.191768 4846 
generic.go:334] "Generic (PLEG): container finished" podID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerID="3f8edfa9371ee48251b9a01604412798436083995180b4421de82fe80a6ba50c" exitCode=0 Nov 22 09:28:09 crc kubenswrapper[4846]: I1122 09:28:09.191829 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" event={"ID":"be2f7b8e-cdfc-4405-a4a7-9d835a12da05","Type":"ContainerDied","Data":"3f8edfa9371ee48251b9a01604412798436083995180b4421de82fe80a6ba50c"} Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.563364 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.677447 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-util\") pod \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.677684 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-bundle\") pod \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.677745 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5m5d\" (UniqueName: \"kubernetes.io/projected/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-kube-api-access-j5m5d\") pod \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\" (UID: \"be2f7b8e-cdfc-4405-a4a7-9d835a12da05\") " Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.679308 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-bundle" (OuterVolumeSpecName: "bundle") pod "be2f7b8e-cdfc-4405-a4a7-9d835a12da05" (UID: "be2f7b8e-cdfc-4405-a4a7-9d835a12da05"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.696287 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-kube-api-access-j5m5d" (OuterVolumeSpecName: "kube-api-access-j5m5d") pod "be2f7b8e-cdfc-4405-a4a7-9d835a12da05" (UID: "be2f7b8e-cdfc-4405-a4a7-9d835a12da05"). InnerVolumeSpecName "kube-api-access-j5m5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.698600 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-util" (OuterVolumeSpecName: "util") pod "be2f7b8e-cdfc-4405-a4a7-9d835a12da05" (UID: "be2f7b8e-cdfc-4405-a4a7-9d835a12da05"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.779773 4846 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.779827 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5m5d\" (UniqueName: \"kubernetes.io/projected/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-kube-api-access-j5m5d\") on node \"crc\" DevicePath \"\"" Nov 22 09:28:10 crc kubenswrapper[4846]: I1122 09:28:10.779843 4846 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be2f7b8e-cdfc-4405-a4a7-9d835a12da05-util\") on node \"crc\" DevicePath \"\"" Nov 22 09:28:11 crc kubenswrapper[4846]: I1122 09:28:11.217712 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" event={"ID":"be2f7b8e-cdfc-4405-a4a7-9d835a12da05","Type":"ContainerDied","Data":"6751aa3945c8d2bd89169abb6c0648727afd95a6e274807f3cd7470d0ccc9c0f"} Nov 22 09:28:11 crc kubenswrapper[4846]: I1122 09:28:11.217761 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6751aa3945c8d2bd89169abb6c0648727afd95a6e274807f3cd7470d0ccc9c0f" Nov 22 09:28:11 crc kubenswrapper[4846]: I1122 09:28:11.217787 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.710047 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr"] Nov 22 09:28:13 crc kubenswrapper[4846]: E1122 09:28:13.710844 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerName="extract" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.710862 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerName="extract" Nov 22 09:28:13 crc kubenswrapper[4846]: E1122 09:28:13.710892 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerName="pull" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.710901 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerName="pull" Nov 22 09:28:13 crc kubenswrapper[4846]: E1122 09:28:13.710923 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerName="util" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.710931 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerName="util" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.711117 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="be2f7b8e-cdfc-4405-a4a7-9d835a12da05" containerName="extract" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.711892 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.739332 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-zt6kc" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.747953 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr"] Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.828914 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jhkn\" (UniqueName: \"kubernetes.io/projected/d1b0081a-5f33-484d-8250-9ec2ab872b64-kube-api-access-5jhkn\") pod \"openstack-operator-controller-operator-559dfbff4-8cpxr\" (UID: \"d1b0081a-5f33-484d-8250-9ec2ab872b64\") " pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.930006 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jhkn\" (UniqueName: \"kubernetes.io/projected/d1b0081a-5f33-484d-8250-9ec2ab872b64-kube-api-access-5jhkn\") pod \"openstack-operator-controller-operator-559dfbff4-8cpxr\" (UID: \"d1b0081a-5f33-484d-8250-9ec2ab872b64\") " pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" Nov 22 09:28:13 crc kubenswrapper[4846]: I1122 09:28:13.957549 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jhkn\" (UniqueName: \"kubernetes.io/projected/d1b0081a-5f33-484d-8250-9ec2ab872b64-kube-api-access-5jhkn\") pod \"openstack-operator-controller-operator-559dfbff4-8cpxr\" (UID: \"d1b0081a-5f33-484d-8250-9ec2ab872b64\") " pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" Nov 22 09:28:14 crc kubenswrapper[4846]: I1122 09:28:14.053893 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" Nov 22 09:28:14 crc kubenswrapper[4846]: I1122 09:28:14.310966 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr"] Nov 22 09:28:15 crc kubenswrapper[4846]: I1122 09:28:15.256498 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" event={"ID":"d1b0081a-5f33-484d-8250-9ec2ab872b64","Type":"ContainerStarted","Data":"edf5e4a8c6e98c442dfc2d91f34efff0d278c7dac76c6c7b1a652310818a3d2b"} Nov 22 09:28:19 crc kubenswrapper[4846]: I1122 09:28:19.305185 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" event={"ID":"d1b0081a-5f33-484d-8250-9ec2ab872b64","Type":"ContainerStarted","Data":"581eae6980408799f82f0785c6ad59a567c66d659a5921d8bf172e3b80c1ddd5"} Nov 22 09:28:22 crc kubenswrapper[4846]: I1122 09:28:22.327977 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" event={"ID":"d1b0081a-5f33-484d-8250-9ec2ab872b64","Type":"ContainerStarted","Data":"a3de4e61e76e96e470cbf1a20179a017c416354120407837ab26652da840b8d5"} Nov 22 09:28:22 crc kubenswrapper[4846]: I1122 09:28:22.363515 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" podStartSLOduration=2.154005291 podStartE2EDuration="9.363491474s" podCreationTimestamp="2025-11-22 09:28:13 +0000 UTC" firstStartedPulling="2025-11-22 09:28:14.324032695 +0000 UTC m=+869.259722344" lastFinishedPulling="2025-11-22 09:28:21.533518878 +0000 UTC m=+876.469208527" observedRunningTime="2025-11-22 09:28:22.359077256 +0000 UTC m=+877.294766895" watchObservedRunningTime="2025-11-22 09:28:22.363491474 +0000 UTC m=+877.299181123" Nov 22 09:28:23 crc kubenswrapper[4846]: I1122 09:28:23.336446 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" Nov 22 09:28:23 crc kubenswrapper[4846]: I1122 09:28:23.352455 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-559dfbff4-8cpxr" Nov 22 09:28:40 crc kubenswrapper[4846]: I1122 09:28:40.974779 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd"] Nov 22 09:28:40 crc kubenswrapper[4846]: I1122 09:28:40.976723 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" Nov 22 09:28:40 crc kubenswrapper[4846]: I1122 09:28:40.981302 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-j77tm" Nov 22 09:28:40 crc kubenswrapper[4846]: I1122 09:28:40.982437 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2"] Nov 22 09:28:40 crc kubenswrapper[4846]: I1122 09:28:40.983874 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" Nov 22 09:28:40 crc kubenswrapper[4846]: I1122 09:28:40.986601 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-sxgm4" Nov 22 09:28:40 crc kubenswrapper[4846]: I1122 09:28:40.991445 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd"] Nov 22 09:28:40 crc kubenswrapper[4846]: I1122 09:28:40.995210 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2"] Nov 22 09:28:40 crc kubenswrapper[4846]: I1122 09:28:40.999888 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.001153 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.013630 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hlzsg" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.019181 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.059165 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.060673 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.067112 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-95c5l" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.081950 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.083205 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.089089 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xplts" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.098573 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.113979 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.162759 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cm4l\" (UniqueName: \"kubernetes.io/projected/01860b24-58b0-422d-a390-fc783a2f4990-kube-api-access-8cm4l\") pod \"barbican-operator-controller-manager-75fb479bcc-qcqgd\" (UID: \"01860b24-58b0-422d-a390-fc783a2f4990\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.163181 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmmcw\" (UniqueName: \"kubernetes.io/projected/f1008fc2-d21a-4775-8505-12116c0a1d94-kube-api-access-fmmcw\") pod \"designate-operator-controller-manager-767ccfd65f-rb6cp\" (UID: \"f1008fc2-d21a-4775-8505-12116c0a1d94\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.163353 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj58n\" (UniqueName: \"kubernetes.io/projected/371bad3e-fcc3-42c5-a563-fc7d6aa5f275-kube-api-access-tj58n\") pod \"cinder-operator-controller-manager-6498cbf48f-r4xp2\" (UID: \"371bad3e-fcc3-42c5-a563-fc7d6aa5f275\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.199269 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.205659 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-t68nm" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.208218 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.227116 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.238847 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.240336 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.242849 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.252335 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.256502 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jw84c" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.279843 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmmcw\" (UniqueName: \"kubernetes.io/projected/f1008fc2-d21a-4775-8505-12116c0a1d94-kube-api-access-fmmcw\") pod \"designate-operator-controller-manager-767ccfd65f-rb6cp\" (UID: \"f1008fc2-d21a-4775-8505-12116c0a1d94\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.279924 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r565p\" (UniqueName: \"kubernetes.io/projected/539f5169-bf3b-4c3c-828a-8490d4d758d8-kube-api-access-r565p\") pod \"heat-operator-controller-manager-56f54d6746-gqktx\" (UID: \"539f5169-bf3b-4c3c-828a-8490d4d758d8\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.279966 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jmpk\" (UniqueName: \"kubernetes.io/projected/4dac3679-62ae-408f-b3ba-1809daaceb47-kube-api-access-2jmpk\") pod \"glance-operator-controller-manager-7969689c84-fwlcn\" (UID: \"4dac3679-62ae-408f-b3ba-1809daaceb47\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.279993 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj58n\" (UniqueName: \"kubernetes.io/projected/371bad3e-fcc3-42c5-a563-fc7d6aa5f275-kube-api-access-tj58n\") pod \"cinder-operator-controller-manager-6498cbf48f-r4xp2\" (UID: \"371bad3e-fcc3-42c5-a563-fc7d6aa5f275\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.280023 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cm4l\" (UniqueName: \"kubernetes.io/projected/01860b24-58b0-422d-a390-fc783a2f4990-kube-api-access-8cm4l\") pod \"barbican-operator-controller-manager-75fb479bcc-qcqgd\" (UID: \"01860b24-58b0-422d-a390-fc783a2f4990\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.294671 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.295999 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.305666 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.308483 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xv2rp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.316607 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.316893 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.327389 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.331212 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmmcw\" (UniqueName: \"kubernetes.io/projected/f1008fc2-d21a-4775-8505-12116c0a1d94-kube-api-access-fmmcw\") pod \"designate-operator-controller-manager-767ccfd65f-rb6cp\" (UID: \"f1008fc2-d21a-4775-8505-12116c0a1d94\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.339021 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.345848 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cm4l\" (UniqueName: \"kubernetes.io/projected/01860b24-58b0-422d-a390-fc783a2f4990-kube-api-access-8cm4l\") pod \"barbican-operator-controller-manager-75fb479bcc-qcqgd\" (UID: \"01860b24-58b0-422d-a390-fc783a2f4990\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.346130 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-n86sj" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.348240 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.348487 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.349024 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.354873 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj58n\" (UniqueName: \"kubernetes.io/projected/371bad3e-fcc3-42c5-a563-fc7d6aa5f275-kube-api-access-tj58n\") pod \"cinder-operator-controller-manager-6498cbf48f-r4xp2\" (UID: \"371bad3e-fcc3-42c5-a563-fc7d6aa5f275\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.355348 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.357759 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jgh7p" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.358004 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-zbrtc" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.359319 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.372376 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.382209 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.391155 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r565p\" (UniqueName: \"kubernetes.io/projected/539f5169-bf3b-4c3c-828a-8490d4d758d8-kube-api-access-r565p\") pod \"heat-operator-controller-manager-56f54d6746-gqktx\" (UID: \"539f5169-bf3b-4c3c-828a-8490d4d758d8\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.391207 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jmpk\" (UniqueName: \"kubernetes.io/projected/4dac3679-62ae-408f-b3ba-1809daaceb47-kube-api-access-2jmpk\") pod \"glance-operator-controller-manager-7969689c84-fwlcn\" (UID: \"4dac3679-62ae-408f-b3ba-1809daaceb47\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.391241 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxq5p\" (UniqueName: \"kubernetes.io/projected/c4abfa7d-5927-41f1-af53-bc1ea6878bc1-kube-api-access-xxq5p\") pod \"horizon-operator-controller-manager-598f69df5d-vt8xd\" (UID: \"c4abfa7d-5927-41f1-af53-bc1ea6878bc1\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.391277 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crzpf\" (UniqueName: \"kubernetes.io/projected/f7cb339f-9ebe-441d-ae17-43ad2ce13201-kube-api-access-crzpf\") pod \"infra-operator-controller-manager-6ccc968f7b-dxpcq\" (UID: 
\"f7cb339f-9ebe-441d-ae17-43ad2ce13201\") " pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.391311 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7cb339f-9ebe-441d-ae17-43ad2ce13201-cert\") pod \"infra-operator-controller-manager-6ccc968f7b-dxpcq\" (UID: \"f7cb339f-9ebe-441d-ae17-43ad2ce13201\") " pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.395661 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.398508 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-hv7cb" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.400703 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.419926 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.421017 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.434615 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-f4d4z" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.437407 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.439949 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.449558 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-zbn7w" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.461643 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jmpk\" (UniqueName: \"kubernetes.io/projected/4dac3679-62ae-408f-b3ba-1809daaceb47-kube-api-access-2jmpk\") pod \"glance-operator-controller-manager-7969689c84-fwlcn\" (UID: \"4dac3679-62ae-408f-b3ba-1809daaceb47\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.480726 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r565p\" (UniqueName: \"kubernetes.io/projected/539f5169-bf3b-4c3c-828a-8490d4d758d8-kube-api-access-r565p\") pod \"heat-operator-controller-manager-56f54d6746-gqktx\" (UID: \"539f5169-bf3b-4c3c-828a-8490d4d758d8\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.497408 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.500981 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8zgx\" (UniqueName: \"kubernetes.io/projected/07d22ef0-2712-4daf-a620-081fee41f68f-kube-api-access-c8zgx\") pod \"neutron-operator-controller-manager-78bd47f458-jwx4v\" (UID: \"07d22ef0-2712-4daf-a620-081fee41f68f\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.501028 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cjqq\" (UniqueName: \"kubernetes.io/projected/facf2ae5-028f-4413-a2d6-e503489ae5f3-kube-api-access-4cjqq\") pod \"octavia-operator-controller-manager-54cfbf4c7d-9jbkf\" (UID: \"facf2ae5-028f-4413-a2d6-e503489ae5f3\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.501069 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzjpj\" (UniqueName: \"kubernetes.io/projected/ab7af809-056a-45c1-bdd0-5e4a8bea02ef-kube-api-access-tzjpj\") pod \"keystone-operator-controller-manager-7454b96578-ltkkr\" (UID: \"ab7af809-056a-45c1-bdd0-5e4a8bea02ef\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.501107 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxq5p\" (UniqueName: \"kubernetes.io/projected/c4abfa7d-5927-41f1-af53-bc1ea6878bc1-kube-api-access-xxq5p\") pod \"horizon-operator-controller-manager-598f69df5d-vt8xd\" (UID: \"c4abfa7d-5927-41f1-af53-bc1ea6878bc1\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.501132 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-275z8\" (UniqueName: 
\"kubernetes.io/projected/e84b7960-5cd2-4557-9b3c-a98ed4784006-kube-api-access-275z8\") pod \"manila-operator-controller-manager-58f887965d-c8ssm\" (UID: \"e84b7960-5cd2-4557-9b3c-a98ed4784006\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.501149 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n7fj\" (UniqueName: \"kubernetes.io/projected/22e226e0-ebce-4d63-9379-109fe06b88da-kube-api-access-9n7fj\") pod \"ironic-operator-controller-manager-99b499f4-9wwm2\" (UID: \"22e226e0-ebce-4d63-9379-109fe06b88da\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.501165 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7bgs\" (UniqueName: \"kubernetes.io/projected/766c68ab-9022-4efd-84a3-af4aedf7d7b2-kube-api-access-v7bgs\") pod \"nova-operator-controller-manager-cfbb9c588-t8nfb\" (UID: \"766c68ab-9022-4efd-84a3-af4aedf7d7b2\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.501187 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxlg\" (UniqueName: \"kubernetes.io/projected/f4a50a36-b951-4342-b092-c94bea3d860e-kube-api-access-zrxlg\") pod \"mariadb-operator-controller-manager-54b5986bb8-jbjc4\" (UID: \"f4a50a36-b951-4342-b092-c94bea3d860e\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.501218 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crzpf\" (UniqueName: \"kubernetes.io/projected/f7cb339f-9ebe-441d-ae17-43ad2ce13201-kube-api-access-crzpf\") pod \"infra-operator-controller-manager-6ccc968f7b-dxpcq\" (UID: \"f7cb339f-9ebe-441d-ae17-43ad2ce13201\") " pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.501254 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7cb339f-9ebe-441d-ae17-43ad2ce13201-cert\") pod \"infra-operator-controller-manager-6ccc968f7b-dxpcq\" (UID: \"f7cb339f-9ebe-441d-ae17-43ad2ce13201\") " pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.504873 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7cb339f-9ebe-441d-ae17-43ad2ce13201-cert\") pod \"infra-operator-controller-manager-6ccc968f7b-dxpcq\" (UID: \"f7cb339f-9ebe-441d-ae17-43ad2ce13201\") " pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.532372 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.560885 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crzpf\" (UniqueName: \"kubernetes.io/projected/f7cb339f-9ebe-441d-ae17-43ad2ce13201-kube-api-access-crzpf\") pod \"infra-operator-controller-manager-6ccc968f7b-dxpcq\" (UID: 
\"f7cb339f-9ebe-441d-ae17-43ad2ce13201\") " pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.565767 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.567737 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.574471 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.575970 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.576203 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxq5p\" (UniqueName: \"kubernetes.io/projected/c4abfa7d-5927-41f1-af53-bc1ea6878bc1-kube-api-access-xxq5p\") pod \"horizon-operator-controller-manager-598f69df5d-vt8xd\" (UID: \"c4abfa7d-5927-41f1-af53-bc1ea6878bc1\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.587101 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-8qfdx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.587364 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.587553 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-lsdpn" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.603531 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8zgx\" (UniqueName: \"kubernetes.io/projected/07d22ef0-2712-4daf-a620-081fee41f68f-kube-api-access-c8zgx\") pod \"neutron-operator-controller-manager-78bd47f458-jwx4v\" (UID: \"07d22ef0-2712-4daf-a620-081fee41f68f\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.603586 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cjqq\" (UniqueName: \"kubernetes.io/projected/facf2ae5-028f-4413-a2d6-e503489ae5f3-kube-api-access-4cjqq\") pod \"octavia-operator-controller-manager-54cfbf4c7d-9jbkf\" (UID: \"facf2ae5-028f-4413-a2d6-e503489ae5f3\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.603611 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzjpj\" (UniqueName: \"kubernetes.io/projected/ab7af809-056a-45c1-bdd0-5e4a8bea02ef-kube-api-access-tzjpj\") pod \"keystone-operator-controller-manager-7454b96578-ltkkr\" (UID: \"ab7af809-056a-45c1-bdd0-5e4a8bea02ef\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.603646 4846 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-275z8\" (UniqueName: \"kubernetes.io/projected/e84b7960-5cd2-4557-9b3c-a98ed4784006-kube-api-access-275z8\") pod \"manila-operator-controller-manager-58f887965d-c8ssm\" (UID: \"e84b7960-5cd2-4557-9b3c-a98ed4784006\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.603662 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n7fj\" (UniqueName: \"kubernetes.io/projected/22e226e0-ebce-4d63-9379-109fe06b88da-kube-api-access-9n7fj\") pod \"ironic-operator-controller-manager-99b499f4-9wwm2\" (UID: \"22e226e0-ebce-4d63-9379-109fe06b88da\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.603681 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7bgs\" (UniqueName: \"kubernetes.io/projected/766c68ab-9022-4efd-84a3-af4aedf7d7b2-kube-api-access-v7bgs\") pod \"nova-operator-controller-manager-cfbb9c588-t8nfb\" (UID: \"766c68ab-9022-4efd-84a3-af4aedf7d7b2\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.603699 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxlg\" (UniqueName: \"kubernetes.io/projected/f4a50a36-b951-4342-b092-c94bea3d860e-kube-api-access-zrxlg\") pod \"mariadb-operator-controller-manager-54b5986bb8-jbjc4\" (UID: \"f4a50a36-b951-4342-b092-c94bea3d860e\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.620681 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.627699 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.631280 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.635560 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.637856 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.641253 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.645745 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-275z8\" (UniqueName: \"kubernetes.io/projected/e84b7960-5cd2-4557-9b3c-a98ed4784006-kube-api-access-275z8\") pod \"manila-operator-controller-manager-58f887965d-c8ssm\" (UID: \"e84b7960-5cd2-4557-9b3c-a98ed4784006\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.650647 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cjqq\" (UniqueName: \"kubernetes.io/projected/facf2ae5-028f-4413-a2d6-e503489ae5f3-kube-api-access-4cjqq\") pod \"octavia-operator-controller-manager-54cfbf4c7d-9jbkf\" (UID: \"facf2ae5-028f-4413-a2d6-e503489ae5f3\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.651376 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-mtjsv" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.651596 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.652847 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.657878 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxlg\" (UniqueName: \"kubernetes.io/projected/f4a50a36-b951-4342-b092-c94bea3d860e-kube-api-access-zrxlg\") pod \"mariadb-operator-controller-manager-54b5986bb8-jbjc4\" (UID: \"f4a50a36-b951-4342-b092-c94bea3d860e\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.661342 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.667440 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hsthg" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.727901 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n7fj\" (UniqueName: \"kubernetes.io/projected/22e226e0-ebce-4d63-9379-109fe06b88da-kube-api-access-9n7fj\") pod \"ironic-operator-controller-manager-99b499f4-9wwm2\" (UID: \"22e226e0-ebce-4d63-9379-109fe06b88da\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.730477 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.731986 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7bgs\" (UniqueName: \"kubernetes.io/projected/766c68ab-9022-4efd-84a3-af4aedf7d7b2-kube-api-access-v7bgs\") pod \"nova-operator-controller-manager-cfbb9c588-t8nfb\" (UID: \"766c68ab-9022-4efd-84a3-af4aedf7d7b2\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.732086 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.737500 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.739772 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.742754 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57jnv\" (UniqueName: \"kubernetes.io/projected/eaacbd1d-48b7-40a3-b7e4-48fc074e37fb-kube-api-access-57jnv\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-thll2\" (UID: \"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.743099 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaacbd1d-48b7-40a3-b7e4-48fc074e37fb-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-thll2\" (UID: \"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.743440 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgrc6\" (UniqueName: \"kubernetes.io/projected/671de1b8-d3f3-4a1e-8572-e2840bf58e17-kube-api-access-xgrc6\") pod \"placement-operator-controller-manager-5b797b8dff-7lg8k\" (UID: \"671de1b8-d3f3-4a1e-8572-e2840bf58e17\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.743515 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xqzt\" (UniqueName: \"kubernetes.io/projected/113bd687-6dff-4159-b034-3a27a0683260-kube-api-access-4xqzt\") pod \"swift-operator-controller-manager-d656998f4-cnhgp\" (UID: \"113bd687-6dff-4159-b034-3a27a0683260\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.743627 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pszpt\" (UniqueName: \"kubernetes.io/projected/5454b9eb-3a18-47d6-ba8e-1b7230659b26-kube-api-access-pszpt\") pod \"ovn-operator-controller-manager-54fc5f65b7-458hx\" (UID: \"5454b9eb-3a18-47d6-ba8e-1b7230659b26\") " 
pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.752463 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8zgx\" (UniqueName: \"kubernetes.io/projected/07d22ef0-2712-4daf-a620-081fee41f68f-kube-api-access-c8zgx\") pod \"neutron-operator-controller-manager-78bd47f458-jwx4v\" (UID: \"07d22ef0-2712-4daf-a620-081fee41f68f\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.782486 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzjpj\" (UniqueName: \"kubernetes.io/projected/ab7af809-056a-45c1-bdd0-5e4a8bea02ef-kube-api-access-tzjpj\") pod \"keystone-operator-controller-manager-7454b96578-ltkkr\" (UID: \"ab7af809-056a-45c1-bdd0-5e4a8bea02ef\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.784911 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.788015 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.798849 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-644f7"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.803106 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.804358 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-f4v6g" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.811251 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-56l49" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.828541 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.834739 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.840939 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.862805 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.865433 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57jnv\" (UniqueName: \"kubernetes.io/projected/eaacbd1d-48b7-40a3-b7e4-48fc074e37fb-kube-api-access-57jnv\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-thll2\" (UID: \"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.865504 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrc6\" (UniqueName: \"kubernetes.io/projected/671de1b8-d3f3-4a1e-8572-e2840bf58e17-kube-api-access-xgrc6\") pod \"placement-operator-controller-manager-5b797b8dff-7lg8k\" (UID: \"671de1b8-d3f3-4a1e-8572-e2840bf58e17\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.865538 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaacbd1d-48b7-40a3-b7e4-48fc074e37fb-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-thll2\" (UID: \"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.865587 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqlrz\" (UniqueName: \"kubernetes.io/projected/63f03060-74f5-437f-bb06-a2626c791a06-kube-api-access-wqlrz\") pod \"telemetry-operator-controller-manager-6d4bf84b58-qbwl5\" (UID: \"63f03060-74f5-437f-bb06-a2626c791a06\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.865618 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xqzt\" (UniqueName: \"kubernetes.io/projected/113bd687-6dff-4159-b034-3a27a0683260-kube-api-access-4xqzt\") pod \"swift-operator-controller-manager-d656998f4-cnhgp\" (UID: \"113bd687-6dff-4159-b034-3a27a0683260\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.865689 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pszpt\" (UniqueName: \"kubernetes.io/projected/5454b9eb-3a18-47d6-ba8e-1b7230659b26-kube-api-access-pszpt\") pod \"ovn-operator-controller-manager-54fc5f65b7-458hx\" (UID: \"5454b9eb-3a18-47d6-ba8e-1b7230659b26\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.865739 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb2lj\" (UniqueName: \"kubernetes.io/projected/8cec188d-f264-4a62-96f1-93e309820fe6-kube-api-access-jb2lj\") pod \"test-operator-controller-manager-b4c496f69-644f7\" (UID: \"8cec188d-f264-4a62-96f1-93e309820fe6\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" Nov 22 09:28:41 crc kubenswrapper[4846]: E1122 09:28:41.866628 4846 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Nov 22 09:28:41 crc kubenswrapper[4846]: E1122 09:28:41.866710 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaacbd1d-48b7-40a3-b7e4-48fc074e37fb-cert podName:eaacbd1d-48b7-40a3-b7e4-48fc074e37fb nodeName:}" failed. No retries permitted until 2025-11-22 09:28:42.366686838 +0000 UTC m=+897.302376487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eaacbd1d-48b7-40a3-b7e4-48fc074e37fb-cert") pod "openstack-baremetal-operator-controller-manager-8c7444f48-thll2" (UID: "eaacbd1d-48b7-40a3-b7e4-48fc074e37fb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.875894 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-644f7"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.893350 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.913452 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.923813 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57jnv\" (UniqueName: \"kubernetes.io/projected/eaacbd1d-48b7-40a3-b7e4-48fc074e37fb-kube-api-access-57jnv\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-thll2\" (UID: \"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.927640 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.932106 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.935256 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgrc6\" (UniqueName: \"kubernetes.io/projected/671de1b8-d3f3-4a1e-8572-e2840bf58e17-kube-api-access-xgrc6\") pod \"placement-operator-controller-manager-5b797b8dff-7lg8k\" (UID: \"671de1b8-d3f3-4a1e-8572-e2840bf58e17\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.936126 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xqzt\" (UniqueName: \"kubernetes.io/projected/113bd687-6dff-4159-b034-3a27a0683260-kube-api-access-4xqzt\") pod \"swift-operator-controller-manager-d656998f4-cnhgp\" (UID: \"113bd687-6dff-4159-b034-3a27a0683260\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.949120 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.952276 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.956286 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pszpt\" (UniqueName: \"kubernetes.io/projected/5454b9eb-3a18-47d6-ba8e-1b7230659b26-kube-api-access-pszpt\") pod \"ovn-operator-controller-manager-54fc5f65b7-458hx\" (UID: \"5454b9eb-3a18-47d6-ba8e-1b7230659b26\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.959921 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5lpx9" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.962955 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj"] Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.983452 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqlrz\" (UniqueName: \"kubernetes.io/projected/63f03060-74f5-437f-bb06-a2626c791a06-kube-api-access-wqlrz\") pod \"telemetry-operator-controller-manager-6d4bf84b58-qbwl5\" (UID: \"63f03060-74f5-437f-bb06-a2626c791a06\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" Nov 22 09:28:41 crc kubenswrapper[4846]: I1122 09:28:41.983520 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb2lj\" (UniqueName: \"kubernetes.io/projected/8cec188d-f264-4a62-96f1-93e309820fe6-kube-api-access-jb2lj\") pod \"test-operator-controller-manager-b4c496f69-644f7\" (UID: \"8cec188d-f264-4a62-96f1-93e309820fe6\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.035684 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqlrz\" (UniqueName: \"kubernetes.io/projected/63f03060-74f5-437f-bb06-a2626c791a06-kube-api-access-wqlrz\") pod \"telemetry-operator-controller-manager-6d4bf84b58-qbwl5\" (UID: \"63f03060-74f5-437f-bb06-a2626c791a06\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.038172 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.049224 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb2lj\" (UniqueName: \"kubernetes.io/projected/8cec188d-f264-4a62-96f1-93e309820fe6-kube-api-access-jb2lj\") pod \"test-operator-controller-manager-b4c496f69-644f7\" (UID: \"8cec188d-f264-4a62-96f1-93e309820fe6\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.071785 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.087021 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4l8t\" (UniqueName: \"kubernetes.io/projected/895d9e5d-08de-4611-a844-c2db9e8e1839-kube-api-access-s4l8t\") pod \"watcher-operator-controller-manager-8c6448b9f-7f7qj\" (UID: \"895d9e5d-08de-4611-a844-c2db9e8e1839\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.093853 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.096324 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.096357 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.096578 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.097372 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.121985 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.122085 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.122580 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-76xsc" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.122726 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-h7gw5" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.130762 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.139545 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.169098 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.190180 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4l8t\" (UniqueName: \"kubernetes.io/projected/895d9e5d-08de-4611-a844-c2db9e8e1839-kube-api-access-s4l8t\") pod \"watcher-operator-controller-manager-8c6448b9f-7f7qj\" (UID: \"895d9e5d-08de-4611-a844-c2db9e8e1839\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.198860 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.211971 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4l8t\" (UniqueName: \"kubernetes.io/projected/895d9e5d-08de-4611-a844-c2db9e8e1839-kube-api-access-s4l8t\") pod \"watcher-operator-controller-manager-8c6448b9f-7f7qj\" (UID: \"895d9e5d-08de-4611-a844-c2db9e8e1839\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.222697 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.292123 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jptkw\" (UniqueName: \"kubernetes.io/projected/7fa6485e-01f3-43e7-ac4e-f639cd3983d5-kube-api-access-jptkw\") pod \"openstack-operator-controller-manager-67485f68cb-z25cl\" (UID: \"7fa6485e-01f3-43e7-ac4e-f639cd3983d5\") " pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.292196 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsgv7\" (UniqueName: \"kubernetes.io/projected/f6c75a18-6338-4da3-8b61-a973a8589e66-kube-api-access-gsgv7\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7v42k\" (UID: \"f6c75a18-6338-4da3-8b61-a973a8589e66\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.292251 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fa6485e-01f3-43e7-ac4e-f639cd3983d5-cert\") pod \"openstack-operator-controller-manager-67485f68cb-z25cl\" (UID: \"7fa6485e-01f3-43e7-ac4e-f639cd3983d5\") " pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.307181 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.397319 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fa6485e-01f3-43e7-ac4e-f639cd3983d5-cert\") pod \"openstack-operator-controller-manager-67485f68cb-z25cl\" (UID: \"7fa6485e-01f3-43e7-ac4e-f639cd3983d5\") " pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.397901 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaacbd1d-48b7-40a3-b7e4-48fc074e37fb-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-thll2\" (UID: \"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.397971 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jptkw\" (UniqueName: \"kubernetes.io/projected/7fa6485e-01f3-43e7-ac4e-f639cd3983d5-kube-api-access-jptkw\") pod \"openstack-operator-controller-manager-67485f68cb-z25cl\" (UID: \"7fa6485e-01f3-43e7-ac4e-f639cd3983d5\") " pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.398030 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsgv7\" (UniqueName: \"kubernetes.io/projected/f6c75a18-6338-4da3-8b61-a973a8589e66-kube-api-access-gsgv7\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7v42k\" (UID: \"f6c75a18-6338-4da3-8b61-a973a8589e66\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" Nov 22 09:28:42 crc kubenswrapper[4846]: E1122 09:28:42.399302 4846 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 22 09:28:42 crc kubenswrapper[4846]: E1122 09:28:42.399396 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fa6485e-01f3-43e7-ac4e-f639cd3983d5-cert podName:7fa6485e-01f3-43e7-ac4e-f639cd3983d5 nodeName:}" failed. No retries permitted until 2025-11-22 09:28:42.899372818 +0000 UTC m=+897.835062527 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fa6485e-01f3-43e7-ac4e-f639cd3983d5-cert") pod "openstack-operator-controller-manager-67485f68cb-z25cl" (UID: "7fa6485e-01f3-43e7-ac4e-f639cd3983d5") : secret "webhook-server-cert" not found Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.411371 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eaacbd1d-48b7-40a3-b7e4-48fc074e37fb-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-thll2\" (UID: \"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.431584 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jptkw\" (UniqueName: \"kubernetes.io/projected/7fa6485e-01f3-43e7-ac4e-f639cd3983d5-kube-api-access-jptkw\") pod \"openstack-operator-controller-manager-67485f68cb-z25cl\" (UID: \"7fa6485e-01f3-43e7-ac4e-f639cd3983d5\") " pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.435083 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsgv7\" (UniqueName: \"kubernetes.io/projected/f6c75a18-6338-4da3-8b61-a973a8589e66-kube-api-access-gsgv7\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7v42k\" (UID: \"f6c75a18-6338-4da3-8b61-a973a8589e66\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.491923 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" event={"ID":"f1008fc2-d21a-4775-8505-12116c0a1d94","Type":"ContainerStarted","Data":"0dd5048f8916350d76a627bed9b99f2c9ee597fa9be49b384871ec3b9d2c0688"} Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.504207 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.571251 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.599389 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.620373 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.682447 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.775128 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.798620 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.829444 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn"] Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.914719 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fa6485e-01f3-43e7-ac4e-f639cd3983d5-cert\") pod \"openstack-operator-controller-manager-67485f68cb-z25cl\" (UID: \"7fa6485e-01f3-43e7-ac4e-f639cd3983d5\") " pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:42 crc kubenswrapper[4846]: I1122 09:28:42.922640 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fa6485e-01f3-43e7-ac4e-f639cd3983d5-cert\") pod \"openstack-operator-controller-manager-67485f68cb-z25cl\" (UID: \"7fa6485e-01f3-43e7-ac4e-f639cd3983d5\") " pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:42 crc kubenswrapper[4846]: W1122 09:28:42.923150 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dac3679_62ae_408f_b3ba_1809daaceb47.slice/crio-911e5981a02d2d3bc80c9d3ee442dcf3cbcc0b7a3fcbdb5079dba677bbbf9278 WatchSource:0}: Error finding container 911e5981a02d2d3bc80c9d3ee442dcf3cbcc0b7a3fcbdb5079dba677bbbf9278: Status 404 returned error can't find the container with id 911e5981a02d2d3bc80c9d3ee442dcf3cbcc0b7a3fcbdb5079dba677bbbf9278 Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.071865 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.221416 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.223937 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.276369 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.282564 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd"] Nov 22 09:28:43 crc kubenswrapper[4846]: W1122 09:28:43.293238 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d22ef0_2712_4daf_a620_081fee41f68f.slice/crio-3221730316dbabbb5d55f5a9b9ff7132b7c017f8840f16fca0b8ad09e67e443c WatchSource:0}: Error finding container 3221730316dbabbb5d55f5a9b9ff7132b7c017f8840f16fca0b8ad09e67e443c: Status 404 returned error can't find the container with id 3221730316dbabbb5d55f5a9b9ff7132b7c017f8840f16fca0b8ad09e67e443c Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.504525 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" event={"ID":"539f5169-bf3b-4c3c-828a-8490d4d758d8","Type":"ContainerStarted","Data":"799f8849f10c7c23a6ca1960f18ac18e12aa1694f860b9d5b9bfe001bb5cb9b4"} Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.506908 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" event={"ID":"f7cb339f-9ebe-441d-ae17-43ad2ce13201","Type":"ContainerStarted","Data":"7130767eab646b51e57190153e52e81feefb205cad66e18efca26d0a2f13a38c"} Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.508344 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" event={"ID":"766c68ab-9022-4efd-84a3-af4aedf7d7b2","Type":"ContainerStarted","Data":"c52a1e0eb6e6f545a94eeb9df738df2be27a0266201bd5a5d5e14da0bee24c85"} Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.511503 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" event={"ID":"371bad3e-fcc3-42c5-a563-fc7d6aa5f275","Type":"ContainerStarted","Data":"69fc7883cce2ab6b998b259b2985ab1da16472b7a218a17dde99dd011b63695b"} Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.517601 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" event={"ID":"01860b24-58b0-422d-a390-fc783a2f4990","Type":"ContainerStarted","Data":"1d14ac3dde6a3faa439004977c90fd87311ec59afc788e309d44e6d60441d023"} Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.519282 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" event={"ID":"4dac3679-62ae-408f-b3ba-1809daaceb47","Type":"ContainerStarted","Data":"911e5981a02d2d3bc80c9d3ee442dcf3cbcc0b7a3fcbdb5079dba677bbbf9278"} Nov 22 09:28:43 crc 
kubenswrapper[4846]: I1122 09:28:43.520572 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" event={"ID":"c4abfa7d-5927-41f1-af53-bc1ea6878bc1","Type":"ContainerStarted","Data":"5b129ba7e120e08d01065bd7fdd915791e0a5477164736dccde1f73987851609"} Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.521973 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" event={"ID":"e84b7960-5cd2-4557-9b3c-a98ed4784006","Type":"ContainerStarted","Data":"0f9072fc6a64f4f6475580c21c33d7b0cec6d3f056e19efc6fd79d76a0015377"} Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.524260 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" event={"ID":"07d22ef0-2712-4daf-a620-081fee41f68f","Type":"ContainerStarted","Data":"3221730316dbabbb5d55f5a9b9ff7132b7c017f8840f16fca0b8ad09e67e443c"} Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.526977 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" event={"ID":"facf2ae5-028f-4413-a2d6-e503489ae5f3","Type":"ContainerStarted","Data":"8fa96594fabf0910f32876c6b014f6560de77f9b86405b4bf3fe4eeda6f69253"} Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.528516 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.685256 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-644f7"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.706285 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2"] Nov 22 09:28:43 crc kubenswrapper[4846]: W1122 09:28:43.726200 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671de1b8_d3f3_4a1e_8572_e2840bf58e17.slice/crio-d49a7cd83733baa90aca07992ec3d2e84345fa37742be45a6ee3b7fa1a20d149 WatchSource:0}: Error finding container d49a7cd83733baa90aca07992ec3d2e84345fa37742be45a6ee3b7fa1a20d149: Status 404 returned error can't find the container with id d49a7cd83733baa90aca07992ec3d2e84345fa37742be45a6ee3b7fa1a20d149 Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.737062 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.748437 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.776738 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr"] Nov 22 09:28:43 crc kubenswrapper[4846]: E1122 09:28:43.780401 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:3ef72bbd7cce89ff54d850ff44ca6d7b2360834a502da3d561aeb6fd3d9af50a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tzjpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7454b96578-ltkkr_openstack-operators(ab7af809-056a-45c1-bdd0-5e4a8bea02ef): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 09:28:43 crc kubenswrapper[4846]: E1122 09:28:43.786712 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zrxlg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-54b5986bb8-jbjc4_openstack-operators(f4a50a36-b951-4342-b092-c94bea3d860e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.813743 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.824772 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.830056 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.835978 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k"] Nov 22 09:28:43 crc kubenswrapper[4846]: W1122 09:28:43.836858 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113bd687_6dff_4159_b034_3a27a0683260.slice/crio-2221e9c6bb3536703641b0fcddec653b3b7fc1a286143eae00aae603e09ee82f WatchSource:0}: Error finding container 2221e9c6bb3536703641b0fcddec653b3b7fc1a286143eae00aae603e09ee82f: Status 404 returned error can't find the container with id 2221e9c6bb3536703641b0fcddec653b3b7fc1a286143eae00aae603e09ee82f Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.840072 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2"] Nov 22 09:28:43 crc kubenswrapper[4846]: I1122 09:28:43.843471 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl"] Nov 22 09:28:43 crc kubenswrapper[4846]: E1122 09:28:43.869458 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4xqzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d656998f4-cnhgp_openstack-operators(113bd687-6dff-4159-b034-3a27a0683260): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 09:28:43 crc kubenswrapper[4846]: E1122 09:28:43.886495 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gsgv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-7v42k_openstack-operators(f6c75a18-6338-4da3-8b61-a973a8589e66): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 09:28:43 crc kubenswrapper[4846]: W1122 09:28:43.887568 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaacbd1d_48b7_40a3_b7e4_48fc074e37fb.slice/crio-ed0d25f7cb52d2dbf71d38da7898207836d666ce7fdeeabc15324c409aca3872 WatchSource:0}: Error finding container ed0d25f7cb52d2dbf71d38da7898207836d666ce7fdeeabc15324c409aca3872: Status 404 returned error can't find the container with id ed0d25f7cb52d2dbf71d38da7898207836d666ce7fdeeabc15324c409aca3872 Nov 22 09:28:43 crc kubenswrapper[4846]: E1122 09:28:43.887665 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" podUID="f6c75a18-6338-4da3-8b61-a973a8589e66" Nov 22 09:28:43 crc kubenswrapper[4846]: E1122 09:28:43.901018 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:
quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstac
k-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:qu
ay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57jnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-8c7444f48-thll2_openstack-operators(eaacbd1d-48b7-40a3-b7e4-48fc074e37fb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 09:28:44 crc kubenswrapper[4846]: E1122 09:28:44.242132 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" podUID="f4a50a36-b951-4342-b092-c94bea3d860e" Nov 22 09:28:44 crc kubenswrapper[4846]: E1122 09:28:44.269255 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" podUID="ab7af809-056a-45c1-bdd0-5e4a8bea02ef" Nov 22 09:28:44 crc kubenswrapper[4846]: E1122 09:28:44.273575 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" podUID="113bd687-6dff-4159-b034-3a27a0683260" Nov 22 09:28:44 crc kubenswrapper[4846]: E1122 09:28:44.288283 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" podUID="eaacbd1d-48b7-40a3-b7e4-48fc074e37fb" Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.570488 4846 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" event={"ID":"ab7af809-056a-45c1-bdd0-5e4a8bea02ef","Type":"ContainerStarted","Data":"bbfa7e898f92606401a8a41753b576f3dcb194eeca1bb764a1f4cb03f8474bb0"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.570544 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" event={"ID":"ab7af809-056a-45c1-bdd0-5e4a8bea02ef","Type":"ContainerStarted","Data":"b72aa56aabf71b163397ad63bf265e9c996b153f93d8721e0da5fb77b84f6bd3"} Nov 22 09:28:44 crc kubenswrapper[4846]: E1122 09:28:44.572915 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:3ef72bbd7cce89ff54d850ff44ca6d7b2360834a502da3d561aeb6fd3d9af50a\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" podUID="ab7af809-056a-45c1-bdd0-5e4a8bea02ef" Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.590652 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" event={"ID":"8cec188d-f264-4a62-96f1-93e309820fe6","Type":"ContainerStarted","Data":"21a1630dcdf21c3a53008fbd7e232b86311792c305eee3db56ca991f3648a877"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.592778 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" event={"ID":"22e226e0-ebce-4d63-9379-109fe06b88da","Type":"ContainerStarted","Data":"fd564191fb1c794edd73df0c4e496d39c8e142fc5757ac1f82904663e500f783"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.599179 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" event={"ID":"5454b9eb-3a18-47d6-ba8e-1b7230659b26","Type":"ContainerStarted","Data":"572ed3e6eda2720f59fb2b692fc145d7a1bb6a727cae139fe5418eaf337d4a53"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.600463 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" event={"ID":"f6c75a18-6338-4da3-8b61-a973a8589e66","Type":"ContainerStarted","Data":"d6439cf9c38cca5f836be08d20e65bf7b3e8fb57c205b9a98b68d63cbe46f9ea"} Nov 22 09:28:44 crc kubenswrapper[4846]: E1122 09:28:44.602757 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" podUID="f6c75a18-6338-4da3-8b61-a973a8589e66" Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.603255 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" event={"ID":"113bd687-6dff-4159-b034-3a27a0683260","Type":"ContainerStarted","Data":"4c99e57affb24d3d368c438144988feee0225c945713dde6246fd9caafc28a61"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.603283 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" 
event={"ID":"113bd687-6dff-4159-b034-3a27a0683260","Type":"ContainerStarted","Data":"2221e9c6bb3536703641b0fcddec653b3b7fc1a286143eae00aae603e09ee82f"} Nov 22 09:28:44 crc kubenswrapper[4846]: E1122 09:28:44.606520 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" podUID="113bd687-6dff-4159-b034-3a27a0683260" Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.607176 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" event={"ID":"895d9e5d-08de-4611-a844-c2db9e8e1839","Type":"ContainerStarted","Data":"89ece77ea0533154636e20c1c6b9e021ffd82771f813eafa4e7170acd74b78e6"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.608835 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" event={"ID":"63f03060-74f5-437f-bb06-a2626c791a06","Type":"ContainerStarted","Data":"7e54da2fb52f701fe825ca643e19d48ec230cc02de6ad5642ad004960275df9f"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.612469 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" event={"ID":"671de1b8-d3f3-4a1e-8572-e2840bf58e17","Type":"ContainerStarted","Data":"d49a7cd83733baa90aca07992ec3d2e84345fa37742be45a6ee3b7fa1a20d149"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.720361 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" event={"ID":"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb","Type":"ContainerStarted","Data":"dfdaa118d49c474cc2a1cad8590dc4857fc2b3a1eb64817f7aee1033680483c8"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.720414 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" event={"ID":"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb","Type":"ContainerStarted","Data":"ed0d25f7cb52d2dbf71d38da7898207836d666ce7fdeeabc15324c409aca3872"} Nov 22 09:28:44 crc kubenswrapper[4846]: E1122 09:28:44.724963 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" podUID="eaacbd1d-48b7-40a3-b7e4-48fc074e37fb" Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.727151 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" event={"ID":"f4a50a36-b951-4342-b092-c94bea3d860e","Type":"ContainerStarted","Data":"e594f72bea79482eb8b28dfec03ef16734b0295651f98cdd1c6ddf50c57024f4"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.727199 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" 
event={"ID":"f4a50a36-b951-4342-b092-c94bea3d860e","Type":"ContainerStarted","Data":"5b741767a8971253eb1612e3afd03153947e901ce67c2226280c7d2cb31ffc83"} Nov 22 09:28:44 crc kubenswrapper[4846]: E1122 09:28:44.728827 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" podUID="f4a50a36-b951-4342-b092-c94bea3d860e" Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.746609 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" event={"ID":"7fa6485e-01f3-43e7-ac4e-f639cd3983d5","Type":"ContainerStarted","Data":"c57d326641d9c64301f6b062dac97a30fc81a107987a49d8c600b41b39857b88"} Nov 22 09:28:44 crc kubenswrapper[4846]: I1122 09:28:44.746880 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" event={"ID":"7fa6485e-01f3-43e7-ac4e-f639cd3983d5","Type":"ContainerStarted","Data":"47d0b872488453c7cbde6f0cb7e91eacfb09ab0234876728ca45b331c59c1e85"} Nov 22 09:28:45 crc kubenswrapper[4846]: I1122 09:28:45.772752 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" event={"ID":"7fa6485e-01f3-43e7-ac4e-f639cd3983d5","Type":"ContainerStarted","Data":"a2b377a1899baaa66a7acf38111efb611bc4fd98cc0e1d38a2e3e43ebf49baa7"} Nov 22 09:28:45 crc kubenswrapper[4846]: E1122 09:28:45.775135 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:3ef72bbd7cce89ff54d850ff44ca6d7b2360834a502da3d561aeb6fd3d9af50a\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" podUID="ab7af809-056a-45c1-bdd0-5e4a8bea02ef" Nov 22 09:28:45 crc kubenswrapper[4846]: E1122 09:28:45.775374 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" podUID="113bd687-6dff-4159-b034-3a27a0683260" Nov 22 09:28:45 crc kubenswrapper[4846]: E1122 09:28:45.775458 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" podUID="eaacbd1d-48b7-40a3-b7e4-48fc074e37fb" Nov 22 09:28:45 crc kubenswrapper[4846]: E1122 09:28:45.776032 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" podUID="f6c75a18-6338-4da3-8b61-a973a8589e66" Nov 22 09:28:45 crc kubenswrapper[4846]: E1122 09:28:45.776094 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" podUID="f4a50a36-b951-4342-b092-c94bea3d860e" Nov 22 09:28:45 crc kubenswrapper[4846]: I1122 09:28:45.876883 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" podStartSLOduration=4.876859934 podStartE2EDuration="4.876859934s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:28:45.873451005 +0000 UTC m=+900.809140664" watchObservedRunningTime="2025-11-22 09:28:45.876859934 +0000 UTC m=+900.812549583" Nov 22 09:28:46 crc kubenswrapper[4846]: I1122 09:28:46.801583 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:53 crc kubenswrapper[4846]: I1122 09:28:53.081636 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-67485f68cb-z25cl" Nov 22 09:28:56 crc kubenswrapper[4846]: E1122 09:28:56.494033 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:70cce55bcf89468c5d468ca2fc317bfc3dc5f2bef1c502df9faca2eb1293ede7" Nov 22 09:28:56 crc kubenswrapper[4846]: E1122 09:28:56.494702 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:70cce55bcf89468c5d468ca2fc317bfc3dc5f2bef1c502df9faca2eb1293ede7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8cm4l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-75fb479bcc-qcqgd_openstack-operators(01860b24-58b0-422d-a390-fc783a2f4990): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:28:56 crc kubenswrapper[4846]: E1122 09:28:56.953973 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:5edd825a235f5784d9a65892763c5388c39df1731d0fcbf4ee33408b8c83ac96" Nov 22 09:28:56 crc kubenswrapper[4846]: E1122 09:28:56.954233 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:5edd825a235f5784d9a65892763c5388c39df1731d0fcbf4ee33408b8c83ac96,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r565p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-56f54d6746-gqktx_openstack-operators(539f5169-bf3b-4c3c-828a-8490d4d758d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:28:57 crc kubenswrapper[4846]: E1122 09:28:57.471800 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f" Nov 22 09:28:57 crc kubenswrapper[4846]: E1122 09:28:57.471994 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wqlrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6d4bf84b58-qbwl5_openstack-operators(63f03060-74f5-437f-bb06-a2626c791a06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:28:58 crc kubenswrapper[4846]: I1122 09:28:58.625448 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:28:58 crc kubenswrapper[4846]: I1122 09:28:58.626194 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:28:58 crc kubenswrapper[4846]: E1122 09:28:58.691587 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.32:5001/openstack-k8s-operators/infra-operator:ae74f1b96e40d2a83da4d5922452b398eeb06179" Nov 22 09:28:58 crc kubenswrapper[4846]: E1122 09:28:58.691700 4846 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.32:5001/openstack-k8s-operators/infra-operator:ae74f1b96e40d2a83da4d5922452b398eeb06179" Nov 22 09:28:58 crc kubenswrapper[4846]: E1122 09:28:58.691918 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.32:5001/openstack-k8s-operators/infra-operator:ae74f1b96e40d2a83da4d5922452b398eeb06179,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-crzpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-6ccc968f7b-dxpcq_openstack-operators(f7cb339f-9ebe-441d-ae17-43ad2ce13201): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:28:59 crc kubenswrapper[4846]: E1122 09:28:59.405747 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f" Nov 22 09:28:59 crc kubenswrapper[4846]: E1122 09:28:59.405970 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s4l8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-8c6448b9f-7f7qj_openstack-operators(895d9e5d-08de-4611-a844-c2db9e8e1839): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:29:00 crc kubenswrapper[4846]: E1122 09:29:00.875688 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" podUID="63f03060-74f5-437f-bb06-a2626c791a06" Nov 22 09:29:00 crc kubenswrapper[4846]: E1122 09:29:00.899515 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" podUID="539f5169-bf3b-4c3c-828a-8490d4d758d8" Nov 22 09:29:00 crc kubenswrapper[4846]: I1122 09:29:00.918054 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" event={"ID":"539f5169-bf3b-4c3c-828a-8490d4d758d8","Type":"ContainerStarted","Data":"74409009ec578097dc7886ef65dd000f202f90f0de01919a37f6c1cd8b7b4a7b"} Nov 22 09:29:00 crc kubenswrapper[4846]: E1122 09:29:00.921374 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:5edd825a235f5784d9a65892763c5388c39df1731d0fcbf4ee33408b8c83ac96\\\"\"" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" podUID="539f5169-bf3b-4c3c-828a-8490d4d758d8" Nov 22 09:29:00 crc kubenswrapper[4846]: I1122 09:29:00.923810 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" event={"ID":"63f03060-74f5-437f-bb06-a2626c791a06","Type":"ContainerStarted","Data":"e4d449332586111edfd9906f6e59242ee68e3f278c950c06299ca43785be92ef"} Nov 22 09:29:00 crc kubenswrapper[4846]: E1122 09:29:00.931347 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" podUID="63f03060-74f5-437f-bb06-a2626c791a06" Nov 22 09:29:01 crc kubenswrapper[4846]: E1122 09:29:01.013542 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" podUID="f7cb339f-9ebe-441d-ae17-43ad2ce13201" Nov 22 09:29:01 crc kubenswrapper[4846]: E1122 09:29:01.029916 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" podUID="01860b24-58b0-422d-a390-fc783a2f4990" Nov 22 09:29:01 crc kubenswrapper[4846]: E1122 09:29:01.313215 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" podUID="895d9e5d-08de-4611-a844-c2db9e8e1839" Nov 22 09:29:01 crc kubenswrapper[4846]: I1122 09:29:01.976130 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" event={"ID":"eaacbd1d-48b7-40a3-b7e4-48fc074e37fb","Type":"ContainerStarted","Data":"e9c0b79915fdb0b11762bcc9674b6770b8a058beaf7337a032472c11bac66716"} Nov 22 09:29:01 crc kubenswrapper[4846]: I1122 09:29:01.977214 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:29:01 crc kubenswrapper[4846]: I1122 09:29:01.997858 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" event={"ID":"f7cb339f-9ebe-441d-ae17-43ad2ce13201","Type":"ContainerStarted","Data":"bffcf9d033780622ffd5f8563d97c2b719541c273005c62e92d56419957bd539"} Nov 22 09:29:02 crc kubenswrapper[4846]: E1122 09:29:02.000269 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.32:5001/openstack-k8s-operators/infra-operator:ae74f1b96e40d2a83da4d5922452b398eeb06179\\\"\"" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" podUID="f7cb339f-9ebe-441d-ae17-43ad2ce13201" Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.018986 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" event={"ID":"4dac3679-62ae-408f-b3ba-1809daaceb47","Type":"ContainerStarted","Data":"142c7ea7bc2f45dcf495c7423ad43e6bc99d2312c08335316500ae869c45488c"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.051483 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" podStartSLOduration=4.520832474 podStartE2EDuration="21.051463253s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.900395715 +0000 UTC 
m=+898.836085364" lastFinishedPulling="2025-11-22 09:29:00.431026494 +0000 UTC m=+915.366716143" observedRunningTime="2025-11-22 09:29:02.042259885 +0000 UTC m=+916.977949534" watchObservedRunningTime="2025-11-22 09:29:02.051463253 +0000 UTC m=+916.987152902" Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.054328 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" event={"ID":"c4abfa7d-5927-41f1-af53-bc1ea6878bc1","Type":"ContainerStarted","Data":"5d09712701809b6fe48a9dc886fa66e965400e5e4abdb41f5881ea7f082304d6"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.059453 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" event={"ID":"8cec188d-f264-4a62-96f1-93e309820fe6","Type":"ContainerStarted","Data":"946a28866dd1b4346ef0f240b55018d5e18727e96a8513de5319e6260fd1f66d"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.084252 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" event={"ID":"766c68ab-9022-4efd-84a3-af4aedf7d7b2","Type":"ContainerStarted","Data":"c57cde866734afb26f28e6f840b849c7316ed9d717a1e407137e5d472b063cf3"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.114194 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" event={"ID":"22e226e0-ebce-4d63-9379-109fe06b88da","Type":"ContainerStarted","Data":"cfce134856133d7891fbc78696e3ba3839934051a7f3a0e1d159897a482b5a11"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.134560 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" event={"ID":"01860b24-58b0-422d-a390-fc783a2f4990","Type":"ContainerStarted","Data":"38e983541b3a24f04e0d557bc759408a88277635836ebcd672848781bc29cd7b"} Nov 22 09:29:02 crc kubenswrapper[4846]: E1122 09:29:02.136746 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:70cce55bcf89468c5d468ca2fc317bfc3dc5f2bef1c502df9faca2eb1293ede7\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" podUID="01860b24-58b0-422d-a390-fc783a2f4990" Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.152349 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" event={"ID":"f1008fc2-d21a-4775-8505-12116c0a1d94","Type":"ContainerStarted","Data":"64f527a2fd616b696b8dc8c97cc068b8107c817d5eb3ca569c4b6b7e4a546414"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.162017 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" event={"ID":"facf2ae5-028f-4413-a2d6-e503489ae5f3","Type":"ContainerStarted","Data":"729e59170c3b5f7a848cee93d2b79c96da1f9f826a61c9e63e3bd91ffec8ce29"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.176180 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" event={"ID":"5454b9eb-3a18-47d6-ba8e-1b7230659b26","Type":"ContainerStarted","Data":"c2aa3dbd8896b1245c74414cde876b38b954837776cfa14f3c5947d66b21de09"} Nov 22 09:29:02 crc 
kubenswrapper[4846]: I1122 09:29:02.192771 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" event={"ID":"e84b7960-5cd2-4557-9b3c-a98ed4784006","Type":"ContainerStarted","Data":"ec590026e8c48a3e21436cc8d96e880894321ccb7732ad118c8a6391349ce5dd"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.208251 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" event={"ID":"07d22ef0-2712-4daf-a620-081fee41f68f","Type":"ContainerStarted","Data":"c864a9c314363dba9724bd36761e6be6009a771b95d822e34601722a0ddcce1f"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.216451 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" event={"ID":"671de1b8-d3f3-4a1e-8572-e2840bf58e17","Type":"ContainerStarted","Data":"7804024fff2671b9252a8cd24e044e3dc3d64a87d5a60c99dc8c30ed59ca9479"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.218791 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" event={"ID":"371bad3e-fcc3-42c5-a563-fc7d6aa5f275","Type":"ContainerStarted","Data":"f36ae1487a589549f057903a1658c490ca56f77ae8ccb9dd45686d618a48330a"} Nov 22 09:29:02 crc kubenswrapper[4846]: I1122 09:29:02.233211 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" event={"ID":"895d9e5d-08de-4611-a844-c2db9e8e1839","Type":"ContainerStarted","Data":"47a61ddbd7c846b0a371d10505423f9711d04929f6a4f8f0853e359f9e60d805"} Nov 22 09:29:02 crc kubenswrapper[4846]: E1122 09:29:02.234696 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:5edd825a235f5784d9a65892763c5388c39df1731d0fcbf4ee33408b8c83ac96\\\"\"" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" podUID="539f5169-bf3b-4c3c-828a-8490d4d758d8" Nov 22 09:29:02 crc kubenswrapper[4846]: E1122 09:29:02.238272 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" podUID="895d9e5d-08de-4611-a844-c2db9e8e1839" Nov 22 09:29:02 crc kubenswrapper[4846]: E1122 09:29:02.238496 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" podUID="63f03060-74f5-437f-bb06-a2626c791a06" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.251622 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" event={"ID":"8cec188d-f264-4a62-96f1-93e309820fe6","Type":"ContainerStarted","Data":"b21fb2c8cc3c910d4f779deb6f321fdec792c6b640bc50db9aae6fa02aec678d"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.253102 
4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.258080 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" event={"ID":"671de1b8-d3f3-4a1e-8572-e2840bf58e17","Type":"ContainerStarted","Data":"2405c419124ad2e8c2a513b55e6f450cb780a3b5eece2c252b479d0fe4d0536a"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.258195 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.263477 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" event={"ID":"766c68ab-9022-4efd-84a3-af4aedf7d7b2","Type":"ContainerStarted","Data":"33c04b86f7f3eca1cdd820c4e87e30fce94e5b685b62f7e2d82a53ac5b2f8eb0"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.265518 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.269389 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" event={"ID":"113bd687-6dff-4159-b034-3a27a0683260","Type":"ContainerStarted","Data":"81c06ccbadc9a6c55083352c46d590ba69c4de5606a893fd558cac7143560f02"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.269873 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.279616 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" podStartSLOduration=6.621540577 podStartE2EDuration="22.279593376s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.714567429 +0000 UTC m=+898.650257078" lastFinishedPulling="2025-11-22 09:28:59.372620228 +0000 UTC m=+914.308309877" observedRunningTime="2025-11-22 09:29:03.273735975 +0000 UTC m=+918.209425624" watchObservedRunningTime="2025-11-22 09:29:03.279593376 +0000 UTC m=+918.215283025" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.280818 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" event={"ID":"e84b7960-5cd2-4557-9b3c-a98ed4784006","Type":"ContainerStarted","Data":"bbb1a6634f8663f0ead81514a3f42af10a46cf271ba74fd89a071724a307efc9"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.280872 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.292220 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" event={"ID":"07d22ef0-2712-4daf-a620-081fee41f68f","Type":"ContainerStarted","Data":"c8cdfcbb26adc33b52ce07bb7596f96408c18ecb24e53d995ee81d3e4a3c8956"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.292627 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.303507 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" podStartSLOduration=6.668269079 podStartE2EDuration="22.303487363s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.737483627 +0000 UTC m=+898.673173276" lastFinishedPulling="2025-11-22 09:28:59.372701911 +0000 UTC m=+914.308391560" observedRunningTime="2025-11-22 09:29:03.29825447 +0000 UTC m=+918.233944119" watchObservedRunningTime="2025-11-22 09:29:03.303487363 +0000 UTC m=+918.239177012" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.307752 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" event={"ID":"4dac3679-62ae-408f-b3ba-1809daaceb47","Type":"ContainerStarted","Data":"39a71f46070b494dc00616613805c65bcbad18d708f15988e910184324e21682"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.308483 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.318164 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" event={"ID":"c4abfa7d-5927-41f1-af53-bc1ea6878bc1","Type":"ContainerStarted","Data":"7199c4d87f498dbbcfda75fc6b813e10da8d19fe30cb6374624b9a6e9cabea0c"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.318789 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.326352 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" event={"ID":"f1008fc2-d21a-4775-8505-12116c0a1d94","Type":"ContainerStarted","Data":"621fb5a327699b8683af411b1e58e306cee3d393db2fe15349a070f955237b17"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.330102 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.342880 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" podStartSLOduration=6.273596193 podStartE2EDuration="22.34285108s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.302863847 +0000 UTC m=+898.238553496" lastFinishedPulling="2025-11-22 09:28:59.372118724 +0000 UTC m=+914.307808383" observedRunningTime="2025-11-22 09:29:03.330604963 +0000 UTC m=+918.266294612" watchObservedRunningTime="2025-11-22 09:29:03.34285108 +0000 UTC m=+918.278540729" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.343408 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" event={"ID":"facf2ae5-028f-4413-a2d6-e503489ae5f3","Type":"ContainerStarted","Data":"1c4fdee9570c93177a5efa1a0a952e437a9795f5a57bd37d6eef1e58ef264150"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.343481 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.349366 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" event={"ID":"371bad3e-fcc3-42c5-a563-fc7d6aa5f275","Type":"ContainerStarted","Data":"47f08c15effc945d9714aebe85d7ef9544cc6013727f6d28c6cacb02e5b335d6"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.350221 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.361452 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" podStartSLOduration=5.812908493 podStartE2EDuration="22.361416902s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.869268009 +0000 UTC m=+898.804957658" lastFinishedPulling="2025-11-22 09:29:00.417776418 +0000 UTC m=+915.353466067" observedRunningTime="2025-11-22 09:29:03.350651888 +0000 UTC m=+918.286341537" watchObservedRunningTime="2025-11-22 09:29:03.361416902 +0000 UTC m=+918.297106551" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.366470 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" event={"ID":"22e226e0-ebce-4d63-9379-109fe06b88da","Type":"ContainerStarted","Data":"5d225e0a07006afc93177a7af426f3cc8cc9794174e0d64a2d7455556c2b958b"} Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.367365 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.376469 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" podStartSLOduration=4.837405644 podStartE2EDuration="22.37644168s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:42.75827523 +0000 UTC m=+897.693964879" lastFinishedPulling="2025-11-22 09:29:00.297311226 +0000 UTC m=+915.233000915" observedRunningTime="2025-11-22 09:29:03.36856519 +0000 UTC m=+918.304254839" watchObservedRunningTime="2025-11-22 09:29:03.37644168 +0000 UTC m=+918.312131329" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.399402 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" podStartSLOduration=5.404087545 podStartE2EDuration="22.399367688s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.303133165 +0000 UTC m=+898.238822814" lastFinishedPulling="2025-11-22 09:29:00.298413308 +0000 UTC m=+915.234102957" observedRunningTime="2025-11-22 09:29:03.391500079 +0000 UTC m=+918.327189738" watchObservedRunningTime="2025-11-22 09:29:03.399367688 +0000 UTC m=+918.335057337" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.407182 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" event={"ID":"5454b9eb-3a18-47d6-ba8e-1b7230659b26","Type":"ContainerStarted","Data":"404919379fec2f7dd40ee74f7beccb06d42aa1ad15dbf1640ea07c0b58146d97"} Nov 22 09:29:03 crc kubenswrapper[4846]: E1122 
09:29:03.415201 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.32:5001/openstack-k8s-operators/infra-operator:ae74f1b96e40d2a83da4d5922452b398eeb06179\\\"\"" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" podUID="f7cb339f-9ebe-441d-ae17-43ad2ce13201" Nov 22 09:29:03 crc kubenswrapper[4846]: E1122 09:29:03.415215 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:70cce55bcf89468c5d468ca2fc317bfc3dc5f2bef1c502df9faca2eb1293ede7\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" podUID="01860b24-58b0-422d-a390-fc783a2f4990" Nov 22 09:29:03 crc kubenswrapper[4846]: E1122 09:29:03.415300 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" podUID="895d9e5d-08de-4611-a844-c2db9e8e1839" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.423758 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" podStartSLOduration=6.384829613 podStartE2EDuration="23.423733688s" podCreationTimestamp="2025-11-22 09:28:40 +0000 UTC" firstStartedPulling="2025-11-22 09:28:42.333755405 +0000 UTC m=+897.269445054" lastFinishedPulling="2025-11-22 09:28:59.37265949 +0000 UTC m=+914.308349129" observedRunningTime="2025-11-22 09:29:03.423201193 +0000 UTC m=+918.358890852" watchObservedRunningTime="2025-11-22 09:29:03.423733688 +0000 UTC m=+918.359423337" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.503951 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" podStartSLOduration=5.514839294 podStartE2EDuration="22.503920436s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.308140171 +0000 UTC m=+898.243829820" lastFinishedPulling="2025-11-22 09:29:00.297221303 +0000 UTC m=+915.232910962" observedRunningTime="2025-11-22 09:29:03.452859547 +0000 UTC m=+918.388549196" watchObservedRunningTime="2025-11-22 09:29:03.503920436 +0000 UTC m=+918.439610095" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.505881 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" podStartSLOduration=5.963662195 podStartE2EDuration="23.505872543s" podCreationTimestamp="2025-11-22 09:28:40 +0000 UTC" firstStartedPulling="2025-11-22 09:28:42.75825973 +0000 UTC m=+897.693949369" lastFinishedPulling="2025-11-22 09:29:00.300470028 +0000 UTC m=+915.236159717" observedRunningTime="2025-11-22 09:29:03.50541661 +0000 UTC m=+918.441106259" watchObservedRunningTime="2025-11-22 09:29:03.505872543 +0000 UTC m=+918.441562192" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.562006 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" 
podStartSLOduration=5.524515816 podStartE2EDuration="22.561985159s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.262354706 +0000 UTC m=+898.198044355" lastFinishedPulling="2025-11-22 09:29:00.299824049 +0000 UTC m=+915.235513698" observedRunningTime="2025-11-22 09:29:03.536325071 +0000 UTC m=+918.472014720" watchObservedRunningTime="2025-11-22 09:29:03.561985159 +0000 UTC m=+918.497674808" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.590078 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" podStartSLOduration=6.151488853 podStartE2EDuration="22.590028846s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:42.933579691 +0000 UTC m=+897.869269340" lastFinishedPulling="2025-11-22 09:28:59.372119684 +0000 UTC m=+914.307809333" observedRunningTime="2025-11-22 09:29:03.563497953 +0000 UTC m=+918.499187602" watchObservedRunningTime="2025-11-22 09:29:03.590028846 +0000 UTC m=+918.525718505" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.613270 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" podStartSLOduration=6.052004553 podStartE2EDuration="22.613244923s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.737963481 +0000 UTC m=+898.673653130" lastFinishedPulling="2025-11-22 09:29:00.299203811 +0000 UTC m=+915.234893500" observedRunningTime="2025-11-22 09:29:03.603966793 +0000 UTC m=+918.539656442" watchObservedRunningTime="2025-11-22 09:29:03.613244923 +0000 UTC m=+918.548934572" Nov 22 09:29:03 crc kubenswrapper[4846]: I1122 09:29:03.647299 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" podStartSLOduration=6.114420132 podStartE2EDuration="22.647278605s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.762914798 +0000 UTC m=+898.698604447" lastFinishedPulling="2025-11-22 09:29:00.295773231 +0000 UTC m=+915.231462920" observedRunningTime="2025-11-22 09:29:03.639423966 +0000 UTC m=+918.575113615" watchObservedRunningTime="2025-11-22 09:29:03.647278605 +0000 UTC m=+918.582968254" Nov 22 09:29:04 crc kubenswrapper[4846]: I1122 09:29:04.415688 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" Nov 22 09:29:07 crc kubenswrapper[4846]: I1122 09:29:07.442776 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" event={"ID":"f6c75a18-6338-4da3-8b61-a973a8589e66","Type":"ContainerStarted","Data":"8e622b2bcabfee2ae5a444a4600fb1554a7f99be280a2a32fa3e5e81cabb9b6e"} Nov 22 09:29:07 crc kubenswrapper[4846]: I1122 09:29:07.446826 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" event={"ID":"f4a50a36-b951-4342-b092-c94bea3d860e","Type":"ContainerStarted","Data":"9179fc867a1f393d765ada8858b755f47ce05a955768252a090f1ac78bcae59d"} Nov 22 09:29:07 crc kubenswrapper[4846]: I1122 09:29:07.447184 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" Nov 22 09:29:07 crc 
kubenswrapper[4846]: I1122 09:29:07.449750 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" event={"ID":"ab7af809-056a-45c1-bdd0-5e4a8bea02ef","Type":"ContainerStarted","Data":"a230b31a240498d80c3e6e1f4601f68114c67d0a3e57f41f5b5f76641f3869b8"} Nov 22 09:29:07 crc kubenswrapper[4846]: I1122 09:29:07.450017 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" Nov 22 09:29:07 crc kubenswrapper[4846]: I1122 09:29:07.466801 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7v42k" podStartSLOduration=3.326210709 podStartE2EDuration="26.466775853s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.886331275 +0000 UTC m=+898.822020924" lastFinishedPulling="2025-11-22 09:29:07.026896419 +0000 UTC m=+921.962586068" observedRunningTime="2025-11-22 09:29:07.465530596 +0000 UTC m=+922.401220265" watchObservedRunningTime="2025-11-22 09:29:07.466775853 +0000 UTC m=+922.402465512" Nov 22 09:29:07 crc kubenswrapper[4846]: I1122 09:29:07.496192 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" podStartSLOduration=3.321663128 podStartE2EDuration="26.4961642s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.780061698 +0000 UTC m=+898.715751337" lastFinishedPulling="2025-11-22 09:29:06.95456277 +0000 UTC m=+921.890252409" observedRunningTime="2025-11-22 09:29:07.495293284 +0000 UTC m=+922.430982933" watchObservedRunningTime="2025-11-22 09:29:07.4961642 +0000 UTC m=+922.431853849" Nov 22 09:29:07 crc kubenswrapper[4846]: I1122 09:29:07.524753 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" podStartSLOduration=3.3489140920000002 podStartE2EDuration="26.524726522s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.786416924 +0000 UTC m=+898.722106573" lastFinishedPulling="2025-11-22 09:29:06.962229354 +0000 UTC m=+921.897919003" observedRunningTime="2025-11-22 09:29:07.523696672 +0000 UTC m=+922.459386331" watchObservedRunningTime="2025-11-22 09:29:07.524726522 +0000 UTC m=+922.460416171" Nov 22 09:29:11 crc kubenswrapper[4846]: I1122 09:29:11.359244 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-rb6cp" Nov 22 09:29:11 crc kubenswrapper[4846]: I1122 09:29:11.638807 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-r4xp2" Nov 22 09:29:11 crc kubenswrapper[4846]: I1122 09:29:11.743941 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7969689c84-fwlcn" Nov 22 09:29:11 crc kubenswrapper[4846]: I1122 09:29:11.745092 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-9jbkf" Nov 22 09:29:11 crc kubenswrapper[4846]: I1122 09:29:11.833148 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-58f887965d-c8ssm" Nov 22 09:29:11 crc kubenswrapper[4846]: I1122 09:29:11.837874 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-vt8xd" Nov 22 09:29:11 crc kubenswrapper[4846]: I1122 09:29:11.919303 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-jwx4v" Nov 22 09:29:11 crc kubenswrapper[4846]: I1122 09:29:11.930818 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-9wwm2" Nov 22 09:29:11 crc kubenswrapper[4846]: I1122 09:29:11.957365 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-t8nfb" Nov 22 09:29:12 crc kubenswrapper[4846]: I1122 09:29:12.048354 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ltkkr" Nov 22 09:29:12 crc kubenswrapper[4846]: I1122 09:29:12.076373 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-458hx" Nov 22 09:29:12 crc kubenswrapper[4846]: I1122 09:29:12.124894 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-7lg8k" Nov 22 09:29:12 crc kubenswrapper[4846]: I1122 09:29:12.142873 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d656998f4-cnhgp" Nov 22 09:29:12 crc kubenswrapper[4846]: I1122 09:29:12.205810 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-b4c496f69-644f7" Nov 22 09:29:12 crc kubenswrapper[4846]: I1122 09:29:12.689411 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-thll2" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.559124 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" event={"ID":"f7cb339f-9ebe-441d-ae17-43ad2ce13201","Type":"ContainerStarted","Data":"8b1920316427dc61ec5c3c6e71592837c16eb74b168c4b21a70e6237a1856c51"} Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.560182 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.561727 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" event={"ID":"895d9e5d-08de-4611-a844-c2db9e8e1839","Type":"ContainerStarted","Data":"62b728b6e0a66c036b58d91b434275bee70803bbd34d2de3b5c1c0e45cd19e91"} Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.561950 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.563776 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" 
event={"ID":"01860b24-58b0-422d-a390-fc783a2f4990","Type":"ContainerStarted","Data":"2df63508b585e6d82107963801aa07fc740dd3069222757322863b012a8e73b9"} Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.564032 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.565198 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" event={"ID":"539f5169-bf3b-4c3c-828a-8490d4d758d8","Type":"ContainerStarted","Data":"01800ef25c7467069ba6a72725409b9d3e283376c379eacc582e37bea388b43e"} Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.565418 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.567641 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" event={"ID":"63f03060-74f5-437f-bb06-a2626c791a06","Type":"ContainerStarted","Data":"b8af5c0594d0528d9cb44078a5a5b2dd796964964851c05a0c6c683323c6b2de"} Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.567832 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.588386 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" podStartSLOduration=2.657783164 podStartE2EDuration="40.588367732s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:42.830263149 +0000 UTC m=+897.765952798" lastFinishedPulling="2025-11-22 09:29:20.760847717 +0000 UTC m=+935.696537366" observedRunningTime="2025-11-22 09:29:21.587528327 +0000 UTC m=+936.523217976" watchObservedRunningTime="2025-11-22 09:29:21.588367732 +0000 UTC m=+936.524057381" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.614396 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" podStartSLOduration=3.347185528 podStartE2EDuration="41.61437895s" podCreationTimestamp="2025-11-22 09:28:40 +0000 UTC" firstStartedPulling="2025-11-22 09:28:42.620918016 +0000 UTC m=+897.556607665" lastFinishedPulling="2025-11-22 09:29:20.888111438 +0000 UTC m=+935.823801087" observedRunningTime="2025-11-22 09:29:21.610405484 +0000 UTC m=+936.546095133" watchObservedRunningTime="2025-11-22 09:29:21.61437895 +0000 UTC m=+936.550068599" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.630244 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" podStartSLOduration=3.524735937 podStartE2EDuration="40.630223812s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.570370635 +0000 UTC m=+898.506060274" lastFinishedPulling="2025-11-22 09:29:20.6758585 +0000 UTC m=+935.611548149" observedRunningTime="2025-11-22 09:29:21.629605404 +0000 UTC m=+936.565295053" watchObservedRunningTime="2025-11-22 09:29:21.630223812 +0000 UTC m=+936.565913461" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.651873 4846 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" podStartSLOduration=2.628884802 podStartE2EDuration="40.651853643s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:42.86355235 +0000 UTC m=+897.799241999" lastFinishedPulling="2025-11-22 09:29:20.886521191 +0000 UTC m=+935.822210840" observedRunningTime="2025-11-22 09:29:21.648282749 +0000 UTC m=+936.583972398" watchObservedRunningTime="2025-11-22 09:29:21.651853643 +0000 UTC m=+936.587543282" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.687635 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" podStartSLOduration=3.63114494 podStartE2EDuration="40.687602545s" podCreationTimestamp="2025-11-22 09:28:41 +0000 UTC" firstStartedPulling="2025-11-22 09:28:43.830750876 +0000 UTC m=+898.766440525" lastFinishedPulling="2025-11-22 09:29:20.887208481 +0000 UTC m=+935.822898130" observedRunningTime="2025-11-22 09:29:21.678611113 +0000 UTC m=+936.614300762" watchObservedRunningTime="2025-11-22 09:29:21.687602545 +0000 UTC m=+936.623292194" Nov 22 09:29:21 crc kubenswrapper[4846]: I1122 09:29:21.897037 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-jbjc4" Nov 22 09:29:28 crc kubenswrapper[4846]: I1122 09:29:28.626191 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:29:28 crc kubenswrapper[4846]: I1122 09:29:28.627029 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:29:31 crc kubenswrapper[4846]: I1122 09:29:31.631969 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcqgd" Nov 22 09:29:31 crc kubenswrapper[4846]: I1122 09:29:31.735841 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-gqktx" Nov 22 09:29:31 crc kubenswrapper[4846]: I1122 09:29:31.872008 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6ccc968f7b-dxpcq" Nov 22 09:29:32 crc kubenswrapper[4846]: I1122 09:29:32.173127 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-qbwl5" Nov 22 09:29:32 crc kubenswrapper[4846]: I1122 09:29:32.311167 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-7f7qj" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.114015 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jlzw6"] Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.116550 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.118984 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jlzw6"] Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.121573 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.121710 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.121800 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.121913 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-nffqw" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.197414 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c9v68"] Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.199545 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.203344 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.219075 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdks8\" (UniqueName: \"kubernetes.io/projected/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-kube-api-access-rdks8\") pod \"dnsmasq-dns-675f4bcbfc-jlzw6\" (UID: \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.219142 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-config\") pod \"dnsmasq-dns-675f4bcbfc-jlzw6\" (UID: \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.220777 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c9v68"] Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.320534 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-config\") pod \"dnsmasq-dns-675f4bcbfc-jlzw6\" (UID: \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.320625 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c9v68\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.320683 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-config\") pod \"dnsmasq-dns-78dd6ddcc-c9v68\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc 
kubenswrapper[4846]: I1122 09:29:46.320728 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v6v2\" (UniqueName: \"kubernetes.io/projected/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-kube-api-access-6v6v2\") pod \"dnsmasq-dns-78dd6ddcc-c9v68\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.320783 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdks8\" (UniqueName: \"kubernetes.io/projected/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-kube-api-access-rdks8\") pod \"dnsmasq-dns-675f4bcbfc-jlzw6\" (UID: \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.321842 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-config\") pod \"dnsmasq-dns-675f4bcbfc-jlzw6\" (UID: \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.341198 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdks8\" (UniqueName: \"kubernetes.io/projected/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-kube-api-access-rdks8\") pod \"dnsmasq-dns-675f4bcbfc-jlzw6\" (UID: \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.421903 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c9v68\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.421994 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-config\") pod \"dnsmasq-dns-78dd6ddcc-c9v68\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.422038 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v6v2\" (UniqueName: \"kubernetes.io/projected/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-kube-api-access-6v6v2\") pod \"dnsmasq-dns-78dd6ddcc-c9v68\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.423084 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-config\") pod \"dnsmasq-dns-78dd6ddcc-c9v68\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.423261 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c9v68\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.439890 4846 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6v6v2\" (UniqueName: \"kubernetes.io/projected/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-kube-api-access-6v6v2\") pod \"dnsmasq-dns-78dd6ddcc-c9v68\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.451566 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.519349 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:29:46 crc kubenswrapper[4846]: I1122 09:29:46.901624 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jlzw6"] Nov 22 09:29:47 crc kubenswrapper[4846]: W1122 09:29:47.014653 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51b8ef7c_1ba2_41f6_9184_5a86e5e1a7f6.slice/crio-654ea3c3d57cd03506d2969872b90230092dc27c366693ba71dc93fc01c58d20 WatchSource:0}: Error finding container 654ea3c3d57cd03506d2969872b90230092dc27c366693ba71dc93fc01c58d20: Status 404 returned error can't find the container with id 654ea3c3d57cd03506d2969872b90230092dc27c366693ba71dc93fc01c58d20 Nov 22 09:29:47 crc kubenswrapper[4846]: I1122 09:29:47.017069 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c9v68"] Nov 22 09:29:47 crc kubenswrapper[4846]: I1122 09:29:47.823304 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" event={"ID":"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6","Type":"ContainerStarted","Data":"654ea3c3d57cd03506d2969872b90230092dc27c366693ba71dc93fc01c58d20"} Nov 22 09:29:47 crc kubenswrapper[4846]: I1122 09:29:47.825619 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" event={"ID":"0138b3cd-807f-4bd7-b6ae-ccf530df1faf","Type":"ContainerStarted","Data":"8d6b264506d6c2336c8b1a01b675d91dc2240cd61cdefc6f442988349af1f2b4"} Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.190827 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jlzw6"] Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.217857 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mccpx"] Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.219338 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.232574 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mccpx"] Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.384542 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mccpx\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.384597 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7zhh\" (UniqueName: \"kubernetes.io/projected/a8cc2084-0252-4943-8f6e-c415924a222f-kube-api-access-b7zhh\") pod \"dnsmasq-dns-666b6646f7-mccpx\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.384662 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-config\") pod \"dnsmasq-dns-666b6646f7-mccpx\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.487960 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mccpx\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.488025 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zhh\" (UniqueName: \"kubernetes.io/projected/a8cc2084-0252-4943-8f6e-c415924a222f-kube-api-access-b7zhh\") pod \"dnsmasq-dns-666b6646f7-mccpx\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.488103 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-config\") pod \"dnsmasq-dns-666b6646f7-mccpx\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.488989 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-mccpx\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.489861 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-config\") pod \"dnsmasq-dns-666b6646f7-mccpx\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.517189 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7zhh\" (UniqueName: 
\"kubernetes.io/projected/a8cc2084-0252-4943-8f6e-c415924a222f-kube-api-access-b7zhh\") pod \"dnsmasq-dns-666b6646f7-mccpx\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.536101 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c9v68"] Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.558791 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.561697 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g6cfm"] Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.564266 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.596225 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g6cfm"] Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.691742 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86bfz\" (UniqueName: \"kubernetes.io/projected/ac74ecfd-8981-4682-847e-b8c23742bfd0-kube-api-access-86bfz\") pod \"dnsmasq-dns-57d769cc4f-g6cfm\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.691872 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-g6cfm\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.691895 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-config\") pod \"dnsmasq-dns-57d769cc4f-g6cfm\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.793584 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-g6cfm\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.793631 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-config\") pod \"dnsmasq-dns-57d769cc4f-g6cfm\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.793668 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86bfz\" (UniqueName: \"kubernetes.io/projected/ac74ecfd-8981-4682-847e-b8c23742bfd0-kube-api-access-86bfz\") pod \"dnsmasq-dns-57d769cc4f-g6cfm\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.796243 4846 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-config\") pod \"dnsmasq-dns-57d769cc4f-g6cfm\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.796671 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-g6cfm\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.832125 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86bfz\" (UniqueName: \"kubernetes.io/projected/ac74ecfd-8981-4682-847e-b8c23742bfd0-kube-api-access-86bfz\") pod \"dnsmasq-dns-57d769cc4f-g6cfm\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:49 crc kubenswrapper[4846]: I1122 09:29:49.897996 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.181482 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mccpx"] Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.320954 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g6cfm"] Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.379305 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.382450 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.393875 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.393958 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.394003 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.394103 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.394101 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.394152 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ddrh2" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.394161 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.404347 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.505427 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-config-data\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.505563 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.505600 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.505641 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.505720 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.505753 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.505962 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/899cf49d-9541-4f23-b1a2-887324973fb1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.506143 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/899cf49d-9541-4f23-b1a2-887324973fb1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.506311 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2wpk\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-kube-api-access-n2wpk\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.506352 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.506370 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608532 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608603 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608646 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/899cf49d-9541-4f23-b1a2-887324973fb1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608686 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/899cf49d-9541-4f23-b1a2-887324973fb1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " 
pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608743 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2wpk\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-kube-api-access-n2wpk\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608766 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608786 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608808 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-config-data\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608831 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608850 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.608879 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.609275 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.609562 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.609825 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.610312 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-config-data\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.610858 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.611572 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.616186 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.617268 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/899cf49d-9541-4f23-b1a2-887324973fb1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.621153 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.626009 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/899cf49d-9541-4f23-b1a2-887324973fb1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.629121 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2wpk\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-kube-api-access-n2wpk\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.669062 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.709983 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.741199 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.747876 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.754401 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.754571 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.754684 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.754695 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.754708 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.755524 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.755814 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nl27q" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.757823 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.812738 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.812825 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.812879 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.812908 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.812930 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.812948 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.812967 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkdt\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-kube-api-access-pgkdt\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.813080 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.813113 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.813130 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.813190 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.864777 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" event={"ID":"ac74ecfd-8981-4682-847e-b8c23742bfd0","Type":"ContainerStarted","Data":"de99244457e309e9caa068939e95fb0f1d687ab83d4ba075c44510b1a8d77e2b"} Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.907414 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" event={"ID":"a8cc2084-0252-4943-8f6e-c415924a222f","Type":"ContainerStarted","Data":"32568a6986bb1501dd7d2501a23f92c527c376249deb3ee424c8d009d78ec272"} Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.915833 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.915897 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.915926 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.915946 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.915969 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.915993 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgkdt\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-kube-api-access-pgkdt\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.916014 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.916035 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.916073 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.916120 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.916234 4846 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.916664 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.916724 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.918866 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.919600 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.921491 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.926644 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.929456 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.929551 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.930114 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.932120 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.950118 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:50 crc kubenswrapper[4846]: I1122 09:29:50.952993 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkdt\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-kube-api-access-pgkdt\") pod \"rabbitmq-cell1-server-0\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.136441 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.304013 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.741012 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.926838 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.935694 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.939608 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wbf95" Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.939995 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.940246 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.940551 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.950172 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 22 09:29:51 crc kubenswrapper[4846]: I1122 09:29:51.970636 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.067451 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93cad534-86a5-4420-951f-859efc86a70a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.068366 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsxf7\" (UniqueName: \"kubernetes.io/projected/93cad534-86a5-4420-951f-859efc86a70a-kube-api-access-gsxf7\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.068400 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93cad534-86a5-4420-951f-859efc86a70a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.068874 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93cad534-86a5-4420-951f-859efc86a70a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.068957 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93cad534-86a5-4420-951f-859efc86a70a-kolla-config\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.068985 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cad534-86a5-4420-951f-859efc86a70a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.069034 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93cad534-86a5-4420-951f-859efc86a70a-config-data-default\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.069220 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.170762 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsxf7\" (UniqueName: \"kubernetes.io/projected/93cad534-86a5-4420-951f-859efc86a70a-kube-api-access-gsxf7\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.170817 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93cad534-86a5-4420-951f-859efc86a70a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.170879 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93cad534-86a5-4420-951f-859efc86a70a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.170906 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93cad534-86a5-4420-951f-859efc86a70a-kolla-config\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.170923 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cad534-86a5-4420-951f-859efc86a70a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.170953 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93cad534-86a5-4420-951f-859efc86a70a-config-data-default\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.170982 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.171005 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93cad534-86a5-4420-951f-859efc86a70a-galera-tls-certs\") pod \"openstack-galera-0\" 
(UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.172083 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93cad534-86a5-4420-951f-859efc86a70a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.172284 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.172821 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93cad534-86a5-4420-951f-859efc86a70a-config-data-default\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.173654 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93cad534-86a5-4420-951f-859efc86a70a-kolla-config\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.174982 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93cad534-86a5-4420-951f-859efc86a70a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.186038 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cad534-86a5-4420-951f-859efc86a70a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.188665 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsxf7\" (UniqueName: \"kubernetes.io/projected/93cad534-86a5-4420-951f-859efc86a70a-kube-api-access-gsxf7\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.210147 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.211671 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93cad534-86a5-4420-951f-859efc86a70a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93cad534-86a5-4420-951f-859efc86a70a\") " pod="openstack/openstack-galera-0" Nov 22 09:29:52 crc kubenswrapper[4846]: I1122 09:29:52.267637 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.269801 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.273306 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.283510 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.285773 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.303992 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.304269 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.309020 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ptzbz" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.408389 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4e66c89-9999-4584-a149-2c18589a522a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.408500 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdlnt\" (UniqueName: \"kubernetes.io/projected/e4e66c89-9999-4584-a149-2c18589a522a-kube-api-access-pdlnt\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.408544 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4e66c89-9999-4584-a149-2c18589a522a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.408567 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4e66c89-9999-4584-a149-2c18589a522a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.408585 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e66c89-9999-4584-a149-2c18589a522a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.408605 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4e66c89-9999-4584-a149-2c18589a522a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.408642 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4e66c89-9999-4584-a149-2c18589a522a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.408661 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.511516 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4e66c89-9999-4584-a149-2c18589a522a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.512003 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdlnt\" (UniqueName: \"kubernetes.io/projected/e4e66c89-9999-4584-a149-2c18589a522a-kube-api-access-pdlnt\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.512080 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4e66c89-9999-4584-a149-2c18589a522a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.512102 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4e66c89-9999-4584-a149-2c18589a522a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.512124 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e66c89-9999-4584-a149-2c18589a522a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.512180 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e66c89-9999-4584-a149-2c18589a522a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.512249 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/e4e66c89-9999-4584-a149-2c18589a522a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.512275 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.512838 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.519325 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4e66c89-9999-4584-a149-2c18589a522a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.519534 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4e66c89-9999-4584-a149-2c18589a522a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.519838 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4e66c89-9999-4584-a149-2c18589a522a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.520733 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4e66c89-9999-4584-a149-2c18589a522a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.523601 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e66c89-9999-4584-a149-2c18589a522a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.524320 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e66c89-9999-4584-a149-2c18589a522a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.547975 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdlnt\" (UniqueName: \"kubernetes.io/projected/e4e66c89-9999-4584-a149-2c18589a522a-kube-api-access-pdlnt\") pod \"openstack-cell1-galera-0\" (UID: 
\"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.559286 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4e66c89-9999-4584-a149-2c18589a522a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.636775 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.653142 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.656344 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.665933 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.668286 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.668481 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.668574 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xqrgv" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.714895 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4d174fc1-bcf2-4812-9766-875d3ca3efe5-kolla-config\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.714940 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d174fc1-bcf2-4812-9766-875d3ca3efe5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.714977 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d174fc1-bcf2-4812-9766-875d3ca3efe5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.715035 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5k9g\" (UniqueName: \"kubernetes.io/projected/4d174fc1-bcf2-4812-9766-875d3ca3efe5-kube-api-access-z5k9g\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.715098 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d174fc1-bcf2-4812-9766-875d3ca3efe5-config-data\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.816682 4846 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5k9g\" (UniqueName: \"kubernetes.io/projected/4d174fc1-bcf2-4812-9766-875d3ca3efe5-kube-api-access-z5k9g\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.816777 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d174fc1-bcf2-4812-9766-875d3ca3efe5-config-data\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.816823 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4d174fc1-bcf2-4812-9766-875d3ca3efe5-kolla-config\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.816841 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d174fc1-bcf2-4812-9766-875d3ca3efe5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.816867 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d174fc1-bcf2-4812-9766-875d3ca3efe5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.829715 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d174fc1-bcf2-4812-9766-875d3ca3efe5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.829979 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4d174fc1-bcf2-4812-9766-875d3ca3efe5-kolla-config\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.830122 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d174fc1-bcf2-4812-9766-875d3ca3efe5-config-data\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.833891 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d174fc1-bcf2-4812-9766-875d3ca3efe5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:53 crc kubenswrapper[4846]: I1122 09:29:53.840406 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5k9g\" (UniqueName: \"kubernetes.io/projected/4d174fc1-bcf2-4812-9766-875d3ca3efe5-kube-api-access-z5k9g\") pod \"memcached-0\" (UID: \"4d174fc1-bcf2-4812-9766-875d3ca3efe5\") " pod="openstack/memcached-0" Nov 22 09:29:54 crc kubenswrapper[4846]: I1122 09:29:54.001291 4846 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 22 09:29:55 crc kubenswrapper[4846]: I1122 09:29:55.598423 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 09:29:55 crc kubenswrapper[4846]: I1122 09:29:55.600074 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 09:29:55 crc kubenswrapper[4846]: I1122 09:29:55.606350 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 09:29:55 crc kubenswrapper[4846]: I1122 09:29:55.606567 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gcfwz" Nov 22 09:29:55 crc kubenswrapper[4846]: I1122 09:29:55.665377 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2hk\" (UniqueName: \"kubernetes.io/projected/340633f3-603b-416e-924a-2938adbde84f-kube-api-access-sn2hk\") pod \"kube-state-metrics-0\" (UID: \"340633f3-603b-416e-924a-2938adbde84f\") " pod="openstack/kube-state-metrics-0" Nov 22 09:29:55 crc kubenswrapper[4846]: I1122 09:29:55.767537 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2hk\" (UniqueName: \"kubernetes.io/projected/340633f3-603b-416e-924a-2938adbde84f-kube-api-access-sn2hk\") pod \"kube-state-metrics-0\" (UID: \"340633f3-603b-416e-924a-2938adbde84f\") " pod="openstack/kube-state-metrics-0" Nov 22 09:29:55 crc kubenswrapper[4846]: I1122 09:29:55.824608 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2hk\" (UniqueName: \"kubernetes.io/projected/340633f3-603b-416e-924a-2938adbde84f-kube-api-access-sn2hk\") pod \"kube-state-metrics-0\" (UID: \"340633f3-603b-416e-924a-2938adbde84f\") " pod="openstack/kube-state-metrics-0" Nov 22 09:29:55 crc kubenswrapper[4846]: I1122 09:29:55.934897 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 09:29:57 crc kubenswrapper[4846]: W1122 09:29:57.048928 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod899cf49d_9541_4f23_b1a2_887324973fb1.slice/crio-e01b8f720907754fd524f2a9a8c3511a96698a99fb85d8aa447eda44caf60122 WatchSource:0}: Error finding container e01b8f720907754fd524f2a9a8c3511a96698a99fb85d8aa447eda44caf60122: Status 404 returned error can't find the container with id e01b8f720907754fd524f2a9a8c3511a96698a99fb85d8aa447eda44caf60122 Nov 22 09:29:57 crc kubenswrapper[4846]: W1122 09:29:57.051429 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98fa314c_0c1f_4dbc_86e0_1f29fd0b52c6.slice/crio-c24757bc17482d1dfbed934a5590a4bddb270beb159f25100908faab857febfc WatchSource:0}: Error finding container c24757bc17482d1dfbed934a5590a4bddb270beb159f25100908faab857febfc: Status 404 returned error can't find the container with id c24757bc17482d1dfbed934a5590a4bddb270beb159f25100908faab857febfc Nov 22 09:29:57 crc kubenswrapper[4846]: I1122 09:29:57.051756 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.046302 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"899cf49d-9541-4f23-b1a2-887324973fb1","Type":"ContainerStarted","Data":"e01b8f720907754fd524f2a9a8c3511a96698a99fb85d8aa447eda44caf60122"} Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.046784 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6","Type":"ContainerStarted","Data":"c24757bc17482d1dfbed934a5590a4bddb270beb159f25100908faab857febfc"} Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.626066 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.626168 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.626300 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.627226 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bac90441ca960230e02742d36a5b95524d70b371a6ee7e32b617df01413fca78"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.627309 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" 
podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://bac90441ca960230e02742d36a5b95524d70b371a6ee7e32b617df01413fca78" gracePeriod=600 Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.734100 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-576fl"] Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.735930 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.746743 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.746881 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zfcrv" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.747286 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.764034 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-576fl"] Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.771401 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bdxdm"] Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.774071 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.797031 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bdxdm"] Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.833841 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-var-log\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.833905 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9315fa04-bcf9-4013-be72-f29a5cf95f4e-scripts\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.833924 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-etc-ovs\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.833946 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcqtn\" (UniqueName: \"kubernetes.io/projected/9315fa04-bcf9-4013-be72-f29a5cf95f4e-kube-api-access-jcqtn\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.834235 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/65c370a7-5d69-437a-98d2-810e97b9a5b7-ovn-controller-tls-certs\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.834353 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c370a7-5d69-437a-98d2-810e97b9a5b7-combined-ca-bundle\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.834394 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65c370a7-5d69-437a-98d2-810e97b9a5b7-var-run\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.834485 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-var-run\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.834572 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/65c370a7-5d69-437a-98d2-810e97b9a5b7-var-run-ovn\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.834652 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-var-lib\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.834790 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrnc7\" (UniqueName: \"kubernetes.io/projected/65c370a7-5d69-437a-98d2-810e97b9a5b7-kube-api-access-mrnc7\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.834861 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/65c370a7-5d69-437a-98d2-810e97b9a5b7-var-log-ovn\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.834978 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65c370a7-5d69-437a-98d2-810e97b9a5b7-scripts\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936567 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrnc7\" (UniqueName: 
\"kubernetes.io/projected/65c370a7-5d69-437a-98d2-810e97b9a5b7-kube-api-access-mrnc7\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936635 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/65c370a7-5d69-437a-98d2-810e97b9a5b7-var-log-ovn\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936674 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65c370a7-5d69-437a-98d2-810e97b9a5b7-scripts\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936705 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-var-log\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936733 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9315fa04-bcf9-4013-be72-f29a5cf95f4e-scripts\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936753 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-etc-ovs\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936769 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcqtn\" (UniqueName: \"kubernetes.io/projected/9315fa04-bcf9-4013-be72-f29a5cf95f4e-kube-api-access-jcqtn\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936861 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/65c370a7-5d69-437a-98d2-810e97b9a5b7-ovn-controller-tls-certs\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936885 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c370a7-5d69-437a-98d2-810e97b9a5b7-combined-ca-bundle\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936902 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65c370a7-5d69-437a-98d2-810e97b9a5b7-var-run\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl" 
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936933 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-var-run\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936961 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/65c370a7-5d69-437a-98d2-810e97b9a5b7-var-run-ovn\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.936983 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-var-lib\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.937677 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-var-lib\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.938461 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/65c370a7-5d69-437a-98d2-810e97b9a5b7-var-log-ovn\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.941456 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65c370a7-5d69-437a-98d2-810e97b9a5b7-scripts\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.941596 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-var-log\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.943270 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9315fa04-bcf9-4013-be72-f29a5cf95f4e-scripts\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.943396 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-etc-ovs\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.944673 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65c370a7-5d69-437a-98d2-810e97b9a5b7-var-run\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.944686 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9315fa04-bcf9-4013-be72-f29a5cf95f4e-var-run\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.944880 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/65c370a7-5d69-437a-98d2-810e97b9a5b7-var-run-ovn\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.954795 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/65c370a7-5d69-437a-98d2-810e97b9a5b7-ovn-controller-tls-certs\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.957009 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c370a7-5d69-437a-98d2-810e97b9a5b7-combined-ca-bundle\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.960237 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrnc7\" (UniqueName: \"kubernetes.io/projected/65c370a7-5d69-437a-98d2-810e97b9a5b7-kube-api-access-mrnc7\") pod \"ovn-controller-576fl\" (UID: \"65c370a7-5d69-437a-98d2-810e97b9a5b7\") " pod="openstack/ovn-controller-576fl"
Nov 22 09:29:58 crc kubenswrapper[4846]: I1122 09:29:58.974932 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcqtn\" (UniqueName: \"kubernetes.io/projected/9315fa04-bcf9-4013-be72-f29a5cf95f4e-kube-api-access-jcqtn\") pod \"ovn-controller-ovs-bdxdm\" (UID: \"9315fa04-bcf9-4013-be72-f29a5cf95f4e\") " pod="openstack/ovn-controller-ovs-bdxdm"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.026752 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.028160 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.030163 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.030637 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.030871 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.031025 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gdvjf"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.031202 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.054771 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.060416 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="bac90441ca960230e02742d36a5b95524d70b371a6ee7e32b617df01413fca78" exitCode=0
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.060500 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"bac90441ca960230e02742d36a5b95524d70b371a6ee7e32b617df01413fca78"}
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.060559 4846 scope.go:117] "RemoveContainer" containerID="98f9262a8d10b551be9acdbca7c91a24b8c83945ea853c86e2932b08cb27780b"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.064749 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-576fl"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.091729 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bdxdm"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.140061 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7wrn\" (UniqueName: \"kubernetes.io/projected/a5c5e879-a8c6-4758-a577-00d371164c9d-kube-api-access-x7wrn\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.140200 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c5e879-a8c6-4758-a577-00d371164c9d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.140319 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c5e879-a8c6-4758-a577-00d371164c9d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.140432 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.140595 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c5e879-a8c6-4758-a577-00d371164c9d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.140725 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a5c5e879-a8c6-4758-a577-00d371164c9d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.140895 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5c5e879-a8c6-4758-a577-00d371164c9d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.140939 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5c5e879-a8c6-4758-a577-00d371164c9d-config\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.242687 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a5c5e879-a8c6-4758-a577-00d371164c9d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.242787 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5c5e879-a8c6-4758-a577-00d371164c9d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.242817 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5c5e879-a8c6-4758-a577-00d371164c9d-config\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.242834 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c5e879-a8c6-4758-a577-00d371164c9d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.242855 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7wrn\" (UniqueName: \"kubernetes.io/projected/a5c5e879-a8c6-4758-a577-00d371164c9d-kube-api-access-x7wrn\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.242878 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c5e879-a8c6-4758-a577-00d371164c9d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.242905 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.242950 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c5e879-a8c6-4758-a577-00d371164c9d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.243594 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.244542 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5c5e879-a8c6-4758-a577-00d371164c9d-config\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.244599 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5c5e879-a8c6-4758-a577-00d371164c9d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.244878 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a5c5e879-a8c6-4758-a577-00d371164c9d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.249539 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c5e879-a8c6-4758-a577-00d371164c9d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.249784 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5c5e879-a8c6-4758-a577-00d371164c9d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.250026 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c5e879-a8c6-4758-a577-00d371164c9d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.282804 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7wrn\" (UniqueName: \"kubernetes.io/projected/a5c5e879-a8c6-4758-a577-00d371164c9d-kube-api-access-x7wrn\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.302563 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a5c5e879-a8c6-4758-a577-00d371164c9d\") " pod="openstack/ovsdbserver-nb-0"
Nov 22 09:29:59 crc kubenswrapper[4846]: I1122 09:29:59.354264 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.153209 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"]
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.157699 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.180018 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.182322 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.195804 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"]
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.267696 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea037c55-edbb-4c60-ab5e-7955eafa3139-secret-volume\") pod \"collect-profiles-29396730-sf48p\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.267813 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea037c55-edbb-4c60-ab5e-7955eafa3139-config-volume\") pod \"collect-profiles-29396730-sf48p\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.267868 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp9xh\" (UniqueName: \"kubernetes.io/projected/ea037c55-edbb-4c60-ab5e-7955eafa3139-kube-api-access-zp9xh\") pod \"collect-profiles-29396730-sf48p\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.370155 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea037c55-edbb-4c60-ab5e-7955eafa3139-config-volume\") pod \"collect-profiles-29396730-sf48p\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.370251 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp9xh\" (UniqueName: \"kubernetes.io/projected/ea037c55-edbb-4c60-ab5e-7955eafa3139-kube-api-access-zp9xh\") pod \"collect-profiles-29396730-sf48p\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.370302 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea037c55-edbb-4c60-ab5e-7955eafa3139-secret-volume\") pod \"collect-profiles-29396730-sf48p\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.371743 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea037c55-edbb-4c60-ab5e-7955eafa3139-config-volume\") pod \"collect-profiles-29396730-sf48p\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.376148 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea037c55-edbb-4c60-ab5e-7955eafa3139-secret-volume\") pod \"collect-profiles-29396730-sf48p\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.392180 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp9xh\" (UniqueName: \"kubernetes.io/projected/ea037c55-edbb-4c60-ab5e-7955eafa3139-kube-api-access-zp9xh\") pod \"collect-profiles-29396730-sf48p\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:00 crc kubenswrapper[4846]: I1122 09:30:00.478901 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.395920 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.400830 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.403079 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.403848 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-9s8hc"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.404658 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.405116 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.415961 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.445399 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf9n6\" (UniqueName: \"kubernetes.io/projected/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-kube-api-access-hf9n6\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.445475 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.445523 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.445568 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.445603 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-config\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.445676 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.445711 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.445746 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.547556 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.547641 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-config\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.547677 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.547709 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.547735 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.547817 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf9n6\" (UniqueName: \"kubernetes.io/projected/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-kube-api-access-hf9n6\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.547860 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.547896 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.548328 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.549803 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.555794 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.556916 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-config\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.562726 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.564480 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.565270 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.575247 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.577530 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf9n6\" (UniqueName: \"kubernetes.io/projected/aff4ba43-41a2-420b-8f89-99c69c1f3cfc-kube-api-access-hf9n6\") pod \"ovsdbserver-sb-0\" (UID: \"aff4ba43-41a2-420b-8f89-99c69c1f3cfc\") " pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:03 crc kubenswrapper[4846]: I1122 09:30:03.731698 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Nov 22 09:30:06 crc kubenswrapper[4846]: E1122 09:30:06.438299 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Nov 22 09:30:06 crc kubenswrapper[4846]: E1122 09:30:06.439021 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-86bfz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-g6cfm_openstack(ac74ecfd-8981-4682-847e-b8c23742bfd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 22 09:30:06 crc kubenswrapper[4846]: E1122 09:30:06.440497 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" podUID="ac74ecfd-8981-4682-847e-b8c23742bfd0"
Nov 22 09:30:06 crc kubenswrapper[4846]: E1122 09:30:06.442716 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Nov 22 09:30:06 crc kubenswrapper[4846]: E1122 09:30:06.442840 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6v6v2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-c9v68_openstack(51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 22 09:30:06 crc kubenswrapper[4846]: E1122 09:30:06.444358 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" podUID="51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6"
Nov 22 09:30:06 crc kubenswrapper[4846]: E1122 09:30:06.479131 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Nov 22 09:30:06 crc kubenswrapper[4846]: E1122 09:30:06.479417 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdks8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-jlzw6_openstack(0138b3cd-807f-4bd7-b6ae-ccf530df1faf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 22 09:30:06 crc kubenswrapper[4846]: E1122 09:30:06.481156 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" podUID="0138b3cd-807f-4bd7-b6ae-ccf530df1faf"
Nov 22 09:30:06 crc kubenswrapper[4846]: I1122 09:30:06.991452 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.000318 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Nov 22 09:30:07 crc kubenswrapper[4846]: W1122 09:30:07.002742 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d174fc1_bcf2_4812_9766_875d3ca3efe5.slice/crio-e8c1a69868211799091d9addbb2e0b066e7d4a49689f2b6ef2999f38a496ebeb WatchSource:0}: Error finding container e8c1a69868211799091d9addbb2e0b066e7d4a49689f2b6ef2999f38a496ebeb: Status 404 returned error can't find the container with id e8c1a69868211799091d9addbb2e0b066e7d4a49689f2b6ef2999f38a496ebeb
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.203243 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93cad534-86a5-4420-951f-859efc86a70a","Type":"ContainerStarted","Data":"ab28f33c8e803d67a47d3d74bbe906a1cc35b995e8497b4d57cee157e2320541"}
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.236196 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.244117 4846 generic.go:334] "Generic (PLEG): container finished" podID="a8cc2084-0252-4943-8f6e-c415924a222f" containerID="4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342" exitCode=0
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.244411 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" event={"ID":"a8cc2084-0252-4943-8f6e-c415924a222f","Type":"ContainerDied","Data":"4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342"}
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.247086 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4d174fc1-bcf2-4812-9766-875d3ca3efe5","Type":"ContainerStarted","Data":"e8c1a69868211799091d9addbb2e0b066e7d4a49689f2b6ef2999f38a496ebeb"}
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.255369 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"cf9936e32ada96756d2d63284f53d35f1bafde25a492c2c86fd57715fcf497eb"}
Nov 22 09:30:07 crc kubenswrapper[4846]: W1122 09:30:07.271017 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65c370a7_5d69_437a_98d2_810e97b9a5b7.slice/crio-21bfce529f39a01dcb6cefb766e8329e1bd22a134177520ad23c03eb1605a6a1 WatchSource:0}: Error finding container 21bfce529f39a01dcb6cefb766e8329e1bd22a134177520ad23c03eb1605a6a1: Status 404 returned error can't find the container with id 21bfce529f39a01dcb6cefb766e8329e1bd22a134177520ad23c03eb1605a6a1
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.275516 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.288302 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-576fl"]
Nov 22 09:30:07 crc kubenswrapper[4846]: W1122 09:30:07.304141 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod340633f3_603b_416e_924a_2938adbde84f.slice/crio-ea76cfa65bd78a07f8f76760e067eb45801acc4be042c74a50f835f838c4b8a5 WatchSource:0}: Error finding container ea76cfa65bd78a07f8f76760e067eb45801acc4be042c74a50f835f838c4b8a5: Status 404 returned error can't find the container with id ea76cfa65bd78a07f8f76760e067eb45801acc4be042c74a50f835f838c4b8a5
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.390521 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"]
Nov 22 09:30:07 crc kubenswrapper[4846]: W1122 09:30:07.405343 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea037c55_edbb_4c60_ab5e_7955eafa3139.slice/crio-4057dc4b3bb35720f36a7e9f12ba5cd086c9e6ffb285ab4bac4e0fe0b92efee0 WatchSource:0}: Error finding container 4057dc4b3bb35720f36a7e9f12ba5cd086c9e6ffb285ab4bac4e0fe0b92efee0: Status 404 returned error can't find the container with id 4057dc4b3bb35720f36a7e9f12ba5cd086c9e6ffb285ab4bac4e0fe0b92efee0
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.434215 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.530859 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 22 09:30:07 crc kubenswrapper[4846]: W1122 09:30:07.799006 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaff4ba43_41a2_420b_8f89_99c69c1f3cfc.slice/crio-2b15879a32be5468c0f71b156810f5c303d5862a1daddd20c678c1c1dbb1065a WatchSource:0}: Error finding container 2b15879a32be5468c0f71b156810f5c303d5862a1daddd20c678c1c1dbb1065a: Status 404 returned error can't find the container with id 2b15879a32be5468c0f71b156810f5c303d5862a1daddd20c678c1c1dbb1065a
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.963630 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68"
Nov 22 09:30:07 crc kubenswrapper[4846]: I1122 09:30:07.971931 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6"
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.042210 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-dns-svc\") pod \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") "
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.042278 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-config\") pod \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") "
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.042315 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdks8\" (UniqueName: \"kubernetes.io/projected/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-kube-api-access-rdks8\") pod \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\" (UID: \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\") "
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.042342 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v6v2\" (UniqueName: \"kubernetes.io/projected/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-kube-api-access-6v6v2\") pod \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\" (UID: \"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6\") "
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.042404 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-config\") pod \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\" (UID: \"0138b3cd-807f-4bd7-b6ae-ccf530df1faf\") "
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.043288 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-config" (OuterVolumeSpecName: "config") pod "0138b3cd-807f-4bd7-b6ae-ccf530df1faf" (UID: "0138b3cd-807f-4bd7-b6ae-ccf530df1faf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.043776 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-config" (OuterVolumeSpecName: "config") pod "51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6" (UID: "51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.043779 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6" (UID: "51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.051196 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-kube-api-access-rdks8" (OuterVolumeSpecName: "kube-api-access-rdks8") pod "0138b3cd-807f-4bd7-b6ae-ccf530df1faf" (UID: "0138b3cd-807f-4bd7-b6ae-ccf530df1faf"). InnerVolumeSpecName "kube-api-access-rdks8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.053720 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-kube-api-access-6v6v2" (OuterVolumeSpecName: "kube-api-access-6v6v2") pod "51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6" (UID: "51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6"). InnerVolumeSpecName "kube-api-access-6v6v2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.144726 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.144773 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-config\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.144786 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdks8\" (UniqueName: \"kubernetes.io/projected/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-kube-api-access-rdks8\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.144798 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v6v2\" (UniqueName: \"kubernetes.io/projected/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6-kube-api-access-6v6v2\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.144811 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0138b3cd-807f-4bd7-b6ae-ccf530df1faf-config\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.267168 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"aff4ba43-41a2-420b-8f89-99c69c1f3cfc","Type":"ContainerStarted","Data":"2b15879a32be5468c0f71b156810f5c303d5862a1daddd20c678c1c1dbb1065a"}
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.268639 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-576fl" event={"ID":"65c370a7-5d69-437a-98d2-810e97b9a5b7","Type":"ContainerStarted","Data":"21bfce529f39a01dcb6cefb766e8329e1bd22a134177520ad23c03eb1605a6a1"}
Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.270660 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68"
event={"ID":"51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6","Type":"ContainerDied","Data":"654ea3c3d57cd03506d2969872b90230092dc27c366693ba71dc93fc01c58d20"} Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.270901 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c9v68" Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.277892 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a5c5e879-a8c6-4758-a577-00d371164c9d","Type":"ContainerStarted","Data":"9a0b93db8dbd0da123f9404e712ff878b4d3b6af3e02e7e68137dd5d752fa805"} Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.280167 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p" event={"ID":"ea037c55-edbb-4c60-ab5e-7955eafa3139","Type":"ContainerStarted","Data":"4057dc4b3bb35720f36a7e9f12ba5cd086c9e6ffb285ab4bac4e0fe0b92efee0"} Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.282323 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"340633f3-603b-416e-924a-2938adbde84f","Type":"ContainerStarted","Data":"ea76cfa65bd78a07f8f76760e067eb45801acc4be042c74a50f835f838c4b8a5"} Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.324134 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" event={"ID":"0138b3cd-807f-4bd7-b6ae-ccf530df1faf","Type":"ContainerDied","Data":"8d6b264506d6c2336c8b1a01b675d91dc2240cd61cdefc6f442988349af1f2b4"} Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.324247 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jlzw6" Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.332364 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c9v68"] Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.341719 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4e66c89-9999-4584-a149-2c18589a522a","Type":"ContainerStarted","Data":"751559a19c71b758747f2b98963586aca06c1ab54ec6943d320f174f9fe53fed"} Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.375936 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c9v68"] Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.404544 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jlzw6"] Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.422641 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jlzw6"] Nov 22 09:30:08 crc kubenswrapper[4846]: I1122 09:30:08.576081 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bdxdm"] Nov 22 09:30:09 crc kubenswrapper[4846]: I1122 09:30:09.354278 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bdxdm" event={"ID":"9315fa04-bcf9-4013-be72-f29a5cf95f4e","Type":"ContainerStarted","Data":"24bcaa4574dfb6fb635eaf3c193634b426fa9d0df8a795f7f9757001b7379dcc"} Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.048409 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0138b3cd-807f-4bd7-b6ae-ccf530df1faf" path="/var/lib/kubelet/pods/0138b3cd-807f-4bd7-b6ae-ccf530df1faf/volumes" Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.049279 4846 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6" path="/var/lib/kubelet/pods/51b8ef7c-1ba2-41f6-9184-5a86e5e1a7f6/volumes" Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.370139 4846 generic.go:334] "Generic (PLEG): container finished" podID="ea037c55-edbb-4c60-ab5e-7955eafa3139" containerID="3d0c0541f13a9fa342cfe9bc0ec1351012ca97fb222b4f938c53d635bd749675" exitCode=0 Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.370247 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p" event={"ID":"ea037c55-edbb-4c60-ab5e-7955eafa3139","Type":"ContainerDied","Data":"3d0c0541f13a9fa342cfe9bc0ec1351012ca97fb222b4f938c53d635bd749675"} Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.373138 4846 generic.go:334] "Generic (PLEG): container finished" podID="ac74ecfd-8981-4682-847e-b8c23742bfd0" containerID="858662d87849c597693415f31b4a0bad265d4a926f02c1da0f689e2f7b6104e9" exitCode=0 Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.373183 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" event={"ID":"ac74ecfd-8981-4682-847e-b8c23742bfd0","Type":"ContainerDied","Data":"858662d87849c597693415f31b4a0bad265d4a926f02c1da0f689e2f7b6104e9"} Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.381067 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" event={"ID":"a8cc2084-0252-4943-8f6e-c415924a222f","Type":"ContainerStarted","Data":"6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9"} Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.382155 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.390117 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"899cf49d-9541-4f23-b1a2-887324973fb1","Type":"ContainerStarted","Data":"56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37"} Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.395094 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6","Type":"ContainerStarted","Data":"fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7"} Nov 22 09:30:10 crc kubenswrapper[4846]: I1122 09:30:10.429899 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" podStartSLOduration=5.111057387 podStartE2EDuration="21.429868457s" podCreationTimestamp="2025-11-22 09:29:49 +0000 UTC" firstStartedPulling="2025-11-22 09:29:50.217731846 +0000 UTC m=+965.153421495" lastFinishedPulling="2025-11-22 09:30:06.536542916 +0000 UTC m=+981.472232565" observedRunningTime="2025-11-22 09:30:10.423511651 +0000 UTC m=+985.359201300" watchObservedRunningTime="2025-11-22 09:30:10.429868457 +0000 UTC m=+985.365558116" Nov 22 09:30:11 crc kubenswrapper[4846]: I1122 09:30:11.410620 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" event={"ID":"ac74ecfd-8981-4682-847e-b8c23742bfd0","Type":"ContainerStarted","Data":"7e184037fabe79602b84b12a8df261859ca6627813657c26cd59c192726cafdd"} Nov 22 09:30:11 crc kubenswrapper[4846]: I1122 09:30:11.411869 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:30:11 crc kubenswrapper[4846]: I1122 09:30:11.439608 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" podStartSLOduration=-9223372014.41519 podStartE2EDuration="22.439586499s" podCreationTimestamp="2025-11-22 09:29:49 +0000 UTC" firstStartedPulling="2025-11-22 09:29:50.35611243 +0000 UTC m=+965.291802079" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:30:11.434441548 +0000 UTC m=+986.370131197" watchObservedRunningTime="2025-11-22 09:30:11.439586499 +0000 UTC m=+986.375276148" Nov 22 09:30:14 crc kubenswrapper[4846]: I1122 09:30:14.562477 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.553183 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p" Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.696552 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea037c55-edbb-4c60-ab5e-7955eafa3139-secret-volume\") pod \"ea037c55-edbb-4c60-ab5e-7955eafa3139\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.696646 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea037c55-edbb-4c60-ab5e-7955eafa3139-config-volume\") pod \"ea037c55-edbb-4c60-ab5e-7955eafa3139\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.696831 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp9xh\" (UniqueName: \"kubernetes.io/projected/ea037c55-edbb-4c60-ab5e-7955eafa3139-kube-api-access-zp9xh\") pod \"ea037c55-edbb-4c60-ab5e-7955eafa3139\" (UID: \"ea037c55-edbb-4c60-ab5e-7955eafa3139\") " Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.698732 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea037c55-edbb-4c60-ab5e-7955eafa3139-config-volume" (OuterVolumeSpecName: "config-volume") pod "ea037c55-edbb-4c60-ab5e-7955eafa3139" (UID: "ea037c55-edbb-4c60-ab5e-7955eafa3139"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.706393 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea037c55-edbb-4c60-ab5e-7955eafa3139-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ea037c55-edbb-4c60-ab5e-7955eafa3139" (UID: "ea037c55-edbb-4c60-ab5e-7955eafa3139"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.712382 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea037c55-edbb-4c60-ab5e-7955eafa3139-kube-api-access-zp9xh" (OuterVolumeSpecName: "kube-api-access-zp9xh") pod "ea037c55-edbb-4c60-ab5e-7955eafa3139" (UID: "ea037c55-edbb-4c60-ab5e-7955eafa3139"). InnerVolumeSpecName "kube-api-access-zp9xh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.798893 4846 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea037c55-edbb-4c60-ab5e-7955eafa3139-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.799268 4846 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea037c55-edbb-4c60-ab5e-7955eafa3139-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:15 crc kubenswrapper[4846]: I1122 09:30:15.799280 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp9xh\" (UniqueName: \"kubernetes.io/projected/ea037c55-edbb-4c60-ab5e-7955eafa3139-kube-api-access-zp9xh\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:16 crc kubenswrapper[4846]: I1122 09:30:16.454006 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p" event={"ID":"ea037c55-edbb-4c60-ab5e-7955eafa3139","Type":"ContainerDied","Data":"4057dc4b3bb35720f36a7e9f12ba5cd086c9e6ffb285ab4bac4e0fe0b92efee0"} Nov 22 09:30:16 crc kubenswrapper[4846]: I1122 09:30:16.454067 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4057dc4b3bb35720f36a7e9f12ba5cd086c9e6ffb285ab4bac4e0fe0b92efee0" Nov 22 09:30:16 crc kubenswrapper[4846]: I1122 09:30:16.454148 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p" Nov 22 09:30:17 crc kubenswrapper[4846]: I1122 09:30:17.477507 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93cad534-86a5-4420-951f-859efc86a70a","Type":"ContainerStarted","Data":"c762287466840969b5b1cfcd825b6ac207c7d66666908c26ecac0e4c10795ccd"} Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.488924 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a5c5e879-a8c6-4758-a577-00d371164c9d","Type":"ContainerStarted","Data":"bef14e8a6e6c2642539231371a6aa154fb6847cb8a3a172e46f8288d104c429d"} Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.492425 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"340633f3-603b-416e-924a-2938adbde84f","Type":"ContainerStarted","Data":"2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e"} Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.492564 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.494552 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4e66c89-9999-4584-a149-2c18589a522a","Type":"ContainerStarted","Data":"bcb7c14516bdc06d107179732a7d38e5d8b86a425f18447721e6683f5bfe6668"} Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.497100 4846 generic.go:334] "Generic (PLEG): container finished" podID="9315fa04-bcf9-4013-be72-f29a5cf95f4e" containerID="fc1ebdae654f966c5eb94af2de2192422965a3c6f7cfe7f1e8a3a4a55c6de254" exitCode=0 Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.497155 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bdxdm" 
event={"ID":"9315fa04-bcf9-4013-be72-f29a5cf95f4e","Type":"ContainerDied","Data":"fc1ebdae654f966c5eb94af2de2192422965a3c6f7cfe7f1e8a3a4a55c6de254"} Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.500584 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4d174fc1-bcf2-4812-9766-875d3ca3efe5","Type":"ContainerStarted","Data":"f3571b6d8fcd7f744265c44a1c11a586d72732bd4115d81af5c2782bf8249db8"} Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.501251 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.502927 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"aff4ba43-41a2-420b-8f89-99c69c1f3cfc","Type":"ContainerStarted","Data":"537aa0f1a694eb188c40b3c8efe2710a84aecd433092bf1b08aa299b3a73c8d6"} Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.508065 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-576fl" event={"ID":"65c370a7-5d69-437a-98d2-810e97b9a5b7","Type":"ContainerStarted","Data":"7219f4ef03f7a943f4c22ec9c7913cfa0212c4896a0e721e1ebd15ee4d17ec6a"} Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.508121 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-576fl" Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.542134 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.544265256 podStartE2EDuration="23.538006759s" podCreationTimestamp="2025-11-22 09:29:55 +0000 UTC" firstStartedPulling="2025-11-22 09:30:07.31083321 +0000 UTC m=+982.246522859" lastFinishedPulling="2025-11-22 09:30:17.304574693 +0000 UTC m=+992.240264362" observedRunningTime="2025-11-22 09:30:18.512852022 +0000 UTC m=+993.448541671" watchObservedRunningTime="2025-11-22 09:30:18.538006759 +0000 UTC m=+993.473696408" Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.566876 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-576fl" podStartSLOduration=11.067277838 podStartE2EDuration="20.566851384s" podCreationTimestamp="2025-11-22 09:29:58 +0000 UTC" firstStartedPulling="2025-11-22 09:30:07.287544428 +0000 UTC m=+982.223234077" lastFinishedPulling="2025-11-22 09:30:16.787117954 +0000 UTC m=+991.722807623" observedRunningTime="2025-11-22 09:30:18.554944115 +0000 UTC m=+993.490633754" watchObservedRunningTime="2025-11-22 09:30:18.566851384 +0000 UTC m=+993.502541033" Nov 22 09:30:18 crc kubenswrapper[4846]: I1122 09:30:18.578387 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.983388 podStartE2EDuration="25.578359451s" podCreationTimestamp="2025-11-22 09:29:53 +0000 UTC" firstStartedPulling="2025-11-22 09:30:07.006190765 +0000 UTC m=+981.941880414" lastFinishedPulling="2025-11-22 09:30:16.601162226 +0000 UTC m=+991.536851865" observedRunningTime="2025-11-22 09:30:18.575498847 +0000 UTC m=+993.511188516" watchObservedRunningTime="2025-11-22 09:30:18.578359451 +0000 UTC m=+993.514049090" Nov 22 09:30:19 crc kubenswrapper[4846]: I1122 09:30:19.518981 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bdxdm" event={"ID":"9315fa04-bcf9-4013-be72-f29a5cf95f4e","Type":"ContainerStarted","Data":"b746c5f9711cc67b165e8b17a6783a274bcd8e6a0a654e02472003d8cf54fd66"} Nov 22 09:30:19 crc 
kubenswrapper[4846]: I1122 09:30:19.519696 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bdxdm" event={"ID":"9315fa04-bcf9-4013-be72-f29a5cf95f4e","Type":"ContainerStarted","Data":"10ad40840a6f91629e441a6186cf3dccc47ba9c1d26ecd85aa0c953ce2a82a65"} Nov 22 09:30:19 crc kubenswrapper[4846]: I1122 09:30:19.900913 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:30:19 crc kubenswrapper[4846]: I1122 09:30:19.930818 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bdxdm" podStartSLOduration=13.913099771 podStartE2EDuration="21.930783403s" podCreationTimestamp="2025-11-22 09:29:58 +0000 UTC" firstStartedPulling="2025-11-22 09:30:08.585600897 +0000 UTC m=+983.521290546" lastFinishedPulling="2025-11-22 09:30:16.603284529 +0000 UTC m=+991.538974178" observedRunningTime="2025-11-22 09:30:19.5432613 +0000 UTC m=+994.478950969" watchObservedRunningTime="2025-11-22 09:30:19.930783403 +0000 UTC m=+994.866473052" Nov 22 09:30:19 crc kubenswrapper[4846]: I1122 09:30:19.955192 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mccpx"] Nov 22 09:30:19 crc kubenswrapper[4846]: I1122 09:30:19.956341 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" podUID="a8cc2084-0252-4943-8f6e-c415924a222f" containerName="dnsmasq-dns" containerID="cri-o://6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9" gracePeriod=10 Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.517747 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.551182 4846 generic.go:334] "Generic (PLEG): container finished" podID="a8cc2084-0252-4943-8f6e-c415924a222f" containerID="6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9" exitCode=0 Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.551320 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" event={"ID":"a8cc2084-0252-4943-8f6e-c415924a222f","Type":"ContainerDied","Data":"6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9"} Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.551408 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" event={"ID":"a8cc2084-0252-4943-8f6e-c415924a222f","Type":"ContainerDied","Data":"32568a6986bb1501dd7d2501a23f92c527c376249deb3ee424c8d009d78ec272"} Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.551351 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-mccpx" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.551430 4846 scope.go:117] "RemoveContainer" containerID="6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.551693 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.551733 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.600177 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-dns-svc\") pod \"a8cc2084-0252-4943-8f6e-c415924a222f\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.600272 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7zhh\" (UniqueName: \"kubernetes.io/projected/a8cc2084-0252-4943-8f6e-c415924a222f-kube-api-access-b7zhh\") pod \"a8cc2084-0252-4943-8f6e-c415924a222f\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.600572 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-config\") pod \"a8cc2084-0252-4943-8f6e-c415924a222f\" (UID: \"a8cc2084-0252-4943-8f6e-c415924a222f\") " Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.616405 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cc2084-0252-4943-8f6e-c415924a222f-kube-api-access-b7zhh" (OuterVolumeSpecName: "kube-api-access-b7zhh") pod "a8cc2084-0252-4943-8f6e-c415924a222f" (UID: "a8cc2084-0252-4943-8f6e-c415924a222f"). InnerVolumeSpecName "kube-api-access-b7zhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.638084 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8cc2084-0252-4943-8f6e-c415924a222f" (UID: "a8cc2084-0252-4943-8f6e-c415924a222f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.638844 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-config" (OuterVolumeSpecName: "config") pod "a8cc2084-0252-4943-8f6e-c415924a222f" (UID: "a8cc2084-0252-4943-8f6e-c415924a222f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.703413 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7zhh\" (UniqueName: \"kubernetes.io/projected/a8cc2084-0252-4943-8f6e-c415924a222f-kube-api-access-b7zhh\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.703898 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.703909 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8cc2084-0252-4943-8f6e-c415924a222f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.767523 4846 scope.go:117] "RemoveContainer" containerID="4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.847230 4846 scope.go:117] "RemoveContainer" containerID="6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9" Nov 22 09:30:20 crc kubenswrapper[4846]: E1122 09:30:20.847801 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9\": container with ID starting with 6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9 not found: ID does not exist" containerID="6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.847857 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9"} err="failed to get container status \"6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9\": rpc error: code = NotFound desc = could not find container \"6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9\": container with ID starting with 6aeb5204ad277be8f74883b45d1ac02debe681969f2d5b087a95fe56468b6dd9 not found: ID does not exist" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.847887 4846 scope.go:117] "RemoveContainer" containerID="4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342" Nov 22 09:30:20 crc kubenswrapper[4846]: E1122 09:30:20.849291 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342\": container with ID starting with 4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342 not found: ID does not exist" containerID="4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.849345 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342"} err="failed to get container status \"4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342\": rpc error: code = NotFound desc = could not find container \"4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342\": container with ID starting with 4827df20b6d3efadb2885be0639794d8002657503010a20104e14aa19a597342 not found: ID does not exist" Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 
09:30:20.896742 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mccpx"] Nov 22 09:30:20 crc kubenswrapper[4846]: I1122 09:30:20.918475 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-mccpx"] Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.567290 4846 generic.go:334] "Generic (PLEG): container finished" podID="e4e66c89-9999-4584-a149-2c18589a522a" containerID="bcb7c14516bdc06d107179732a7d38e5d8b86a425f18447721e6683f5bfe6668" exitCode=0 Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.567435 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4e66c89-9999-4584-a149-2c18589a522a","Type":"ContainerDied","Data":"bcb7c14516bdc06d107179732a7d38e5d8b86a425f18447721e6683f5bfe6668"} Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.570586 4846 generic.go:334] "Generic (PLEG): container finished" podID="93cad534-86a5-4420-951f-859efc86a70a" containerID="c762287466840969b5b1cfcd825b6ac207c7d66666908c26ecac0e4c10795ccd" exitCode=0 Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.570665 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93cad534-86a5-4420-951f-859efc86a70a","Type":"ContainerDied","Data":"c762287466840969b5b1cfcd825b6ac207c7d66666908c26ecac0e4c10795ccd"} Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.575575 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"aff4ba43-41a2-420b-8f89-99c69c1f3cfc","Type":"ContainerStarted","Data":"705d78eb198a562f8d9d40dff2d6217b544018c590592820f78b658c41befa5b"} Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.580358 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a5c5e879-a8c6-4758-a577-00d371164c9d","Type":"ContainerStarted","Data":"fcbbdebc541c7bccd4e1d1313b473954ddd0e38308aa266cd47aa4b1eb63ae33"} Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.653904 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.324819997 podStartE2EDuration="24.653881814s" podCreationTimestamp="2025-11-22 09:29:57 +0000 UTC" firstStartedPulling="2025-11-22 09:30:07.520276076 +0000 UTC m=+982.455965725" lastFinishedPulling="2025-11-22 09:30:20.849337883 +0000 UTC m=+995.785027542" observedRunningTime="2025-11-22 09:30:21.651095322 +0000 UTC m=+996.586784961" watchObservedRunningTime="2025-11-22 09:30:21.653881814 +0000 UTC m=+996.589571463" Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.691688 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.649281803 podStartE2EDuration="19.691659481s" podCreationTimestamp="2025-11-22 09:30:02 +0000 UTC" firstStartedPulling="2025-11-22 09:30:07.805916844 +0000 UTC m=+982.741606493" lastFinishedPulling="2025-11-22 09:30:20.848294522 +0000 UTC m=+995.783984171" observedRunningTime="2025-11-22 09:30:21.686605212 +0000 UTC m=+996.622294881" watchObservedRunningTime="2025-11-22 09:30:21.691659481 +0000 UTC m=+996.627349140" Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.733766 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 22 09:30:21 crc kubenswrapper[4846]: I1122 09:30:21.873472 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-sb-0" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.044909 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8cc2084-0252-4943-8f6e-c415924a222f" path="/var/lib/kubelet/pods/a8cc2084-0252-4943-8f6e-c415924a222f/volumes" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.593436 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93cad534-86a5-4420-951f-859efc86a70a","Type":"ContainerStarted","Data":"1387ce5dc5b28bfaa9406761081be1d993e5b067af16d51c3544397d8538063a"} Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.599715 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4e66c89-9999-4584-a149-2c18589a522a","Type":"ContainerStarted","Data":"defae3eb5ca979fe179cb2a480c4d9b933113d9829304dccfa1bac46ca4c54fb"} Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.600325 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.631008 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.154243352 podStartE2EDuration="32.63098364s" podCreationTimestamp="2025-11-22 09:29:50 +0000 UTC" firstStartedPulling="2025-11-22 09:30:06.993870684 +0000 UTC m=+981.929560333" lastFinishedPulling="2025-11-22 09:30:16.470610962 +0000 UTC m=+991.406300621" observedRunningTime="2025-11-22 09:30:22.62382194 +0000 UTC m=+997.559511599" watchObservedRunningTime="2025-11-22 09:30:22.63098364 +0000 UTC m=+997.566673299" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.650622 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.135114992 podStartE2EDuration="30.650600914s" podCreationTimestamp="2025-11-22 09:29:52 +0000 UTC" firstStartedPulling="2025-11-22 09:30:07.27122532 +0000 UTC m=+982.206914969" lastFinishedPulling="2025-11-22 09:30:16.786711242 +0000 UTC m=+991.722400891" observedRunningTime="2025-11-22 09:30:22.646986749 +0000 UTC m=+997.582676408" watchObservedRunningTime="2025-11-22 09:30:22.650600914 +0000 UTC m=+997.586290573" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.657947 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.961539 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhf74"] Nov 22 09:30:22 crc kubenswrapper[4846]: E1122 09:30:22.961988 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea037c55-edbb-4c60-ab5e-7955eafa3139" containerName="collect-profiles" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.962010 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea037c55-edbb-4c60-ab5e-7955eafa3139" containerName="collect-profiles" Nov 22 09:30:22 crc kubenswrapper[4846]: E1122 09:30:22.962036 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8cc2084-0252-4943-8f6e-c415924a222f" containerName="init" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.962062 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cc2084-0252-4943-8f6e-c415924a222f" containerName="init" Nov 22 09:30:22 crc kubenswrapper[4846]: E1122 09:30:22.962080 4846 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a8cc2084-0252-4943-8f6e-c415924a222f" containerName="dnsmasq-dns" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.962090 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cc2084-0252-4943-8f6e-c415924a222f" containerName="dnsmasq-dns" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.962281 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8cc2084-0252-4943-8f6e-c415924a222f" containerName="dnsmasq-dns" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.962302 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea037c55-edbb-4c60-ab5e-7955eafa3139" containerName="collect-profiles" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.963559 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.966345 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 22 09:30:22 crc kubenswrapper[4846]: I1122 09:30:22.975714 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhf74"] Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.012264 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-bnq8b"] Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.013391 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.015947 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.029954 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bnq8b"] Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.052394 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b3f55d-b10e-40f1-9d45-4ed801491f54-config\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.052476 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.052526 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/80b3f55d-b10e-40f1-9d45-4ed801491f54-ovn-rundir\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.052545 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b3f55d-b10e-40f1-9d45-4ed801491f54-combined-ca-bundle\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.052579 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtm5t\" (UniqueName: \"kubernetes.io/projected/76429f86-1efd-47a4-8683-cf2e95194792-kube-api-access-dtm5t\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.052704 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4jdr\" (UniqueName: \"kubernetes.io/projected/80b3f55d-b10e-40f1-9d45-4ed801491f54-kube-api-access-k4jdr\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.052736 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/80b3f55d-b10e-40f1-9d45-4ed801491f54-ovs-rundir\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.052768 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-config\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.052978 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b3f55d-b10e-40f1-9d45-4ed801491f54-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.053131 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.155707 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/80b3f55d-b10e-40f1-9d45-4ed801491f54-ovn-rundir\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.155809 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b3f55d-b10e-40f1-9d45-4ed801491f54-combined-ca-bundle\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.155863 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtm5t\" (UniqueName: \"kubernetes.io/projected/76429f86-1efd-47a4-8683-cf2e95194792-kube-api-access-dtm5t\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 
22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.155909 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4jdr\" (UniqueName: \"kubernetes.io/projected/80b3f55d-b10e-40f1-9d45-4ed801491f54-kube-api-access-k4jdr\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.155949 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/80b3f55d-b10e-40f1-9d45-4ed801491f54-ovs-rundir\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.155988 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-config\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.156034 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b3f55d-b10e-40f1-9d45-4ed801491f54-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.156101 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.156141 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b3f55d-b10e-40f1-9d45-4ed801491f54-config\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.156145 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/80b3f55d-b10e-40f1-9d45-4ed801491f54-ovs-rundir\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.156193 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.156090 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/80b3f55d-b10e-40f1-9d45-4ed801491f54-ovn-rundir\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.157323 4846 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-config\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.157385 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.157397 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.158169 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b3f55d-b10e-40f1-9d45-4ed801491f54-config\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.163394 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b3f55d-b10e-40f1-9d45-4ed801491f54-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.169776 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b3f55d-b10e-40f1-9d45-4ed801491f54-combined-ca-bundle\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.172408 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtm5t\" (UniqueName: \"kubernetes.io/projected/76429f86-1efd-47a4-8683-cf2e95194792-kube-api-access-dtm5t\") pod \"dnsmasq-dns-7f896c8c65-zhf74\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.177809 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4jdr\" (UniqueName: \"kubernetes.io/projected/80b3f55d-b10e-40f1-9d45-4ed801491f54-kube-api-access-k4jdr\") pod \"ovn-controller-metrics-bnq8b\" (UID: \"80b3f55d-b10e-40f1-9d45-4ed801491f54\") " pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.310126 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhf74"] Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.310864 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.335577 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmhkl"] Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.337654 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.342488 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.352756 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmhkl"] Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.353189 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bnq8b" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.356510 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.461677 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.461762 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.461817 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stl9r\" (UniqueName: \"kubernetes.io/projected/5ee1c219-1f23-4236-a364-84d9dc62d9a5-kube-api-access-stl9r\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.461859 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-config\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.461881 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.486179 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.563470 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.563906 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stl9r\" (UniqueName: \"kubernetes.io/projected/5ee1c219-1f23-4236-a364-84d9dc62d9a5-kube-api-access-stl9r\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.563955 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-config\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.563976 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.564061 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.565347 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.565628 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.566006 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-config\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.566460 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.596884 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stl9r\" (UniqueName: \"kubernetes.io/projected/5ee1c219-1f23-4236-a364-84d9dc62d9a5-kube-api-access-stl9r\") pod \"dnsmasq-dns-86db49b7ff-xmhkl\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.610905 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.642337 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.642407 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.659944 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.768705 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.807757 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.812576 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.816763 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.817024 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.818100 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.820676 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-vcrn7" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.823484 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.836404 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhf74"] Nov 22 09:30:23 crc kubenswrapper[4846]: W1122 09:30:23.859103 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76429f86_1efd_47a4_8683_cf2e95194792.slice/crio-79362ce33fdac4f0b949d749d133eb36b0ef3fc6af247824f2f031290881ec39 WatchSource:0}: Error finding container 79362ce33fdac4f0b949d749d133eb36b0ef3fc6af247824f2f031290881ec39: Status 404 returned error can't find the container with id 79362ce33fdac4f0b949d749d133eb36b0ef3fc6af247824f2f031290881ec39 Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.869513 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-scripts\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0" Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.869588 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnftv\" (UniqueName: \"kubernetes.io/projected/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-kube-api-access-hnftv\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0" Nov 22 09:30:23 crc kubenswrapper[4846]: 
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.869633 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.869673 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.869727 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-config\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.869745 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.869764 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.902172 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bnq8b"]
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.972416 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnftv\" (UniqueName: \"kubernetes.io/projected/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-kube-api-access-hnftv\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.972801 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.972880 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.973293 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-config\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.973314 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.973334 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.973377 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-scripts\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.976632 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-config\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.976712 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-scripts\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.977921 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.983117 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.984353 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.984504 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
Nov 22 09:30:23 crc kubenswrapper[4846]: I1122 09:30:23.990371 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnftv\" (UniqueName: \"kubernetes.io/projected/fa80bcbe-b4a6-4515-b366-9ba9b0d92440-kube-api-access-hnftv\") pod \"ovn-northd-0\" (UID: \"fa80bcbe-b4a6-4515-b366-9ba9b0d92440\") " pod="openstack/ovn-northd-0"
pod="openstack/memcached-0" Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.139790 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.318475 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmhkl"] Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.443982 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 09:30:24 crc kubenswrapper[4846]: W1122 09:30:24.447423 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa80bcbe_b4a6_4515_b366_9ba9b0d92440.slice/crio-de0518c85889eef377a054adfae2eb664fb040171ea5d4dabd3a4cf8a0575053 WatchSource:0}: Error finding container de0518c85889eef377a054adfae2eb664fb040171ea5d4dabd3a4cf8a0575053: Status 404 returned error can't find the container with id de0518c85889eef377a054adfae2eb664fb040171ea5d4dabd3a4cf8a0575053 Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.620895 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bnq8b" event={"ID":"80b3f55d-b10e-40f1-9d45-4ed801491f54","Type":"ContainerStarted","Data":"e68e23218439fa76581b909524a064da934f4378e7ea88786b81b91d3d6bcb7e"} Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.621416 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bnq8b" event={"ID":"80b3f55d-b10e-40f1-9d45-4ed801491f54","Type":"ContainerStarted","Data":"9b8cf9e0d1cd8821306884fcfa8013993bcd5868323c700b264a0a4d21c10b2b"} Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.622622 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" event={"ID":"5ee1c219-1f23-4236-a364-84d9dc62d9a5","Type":"ContainerStarted","Data":"20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88"} Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.622644 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" event={"ID":"5ee1c219-1f23-4236-a364-84d9dc62d9a5","Type":"ContainerStarted","Data":"ea3d74cc302611d7e7d5556a036268ddea2f1999bff3d6178905739370b41244"} Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.625016 4846 generic.go:334] "Generic (PLEG): container finished" podID="76429f86-1efd-47a4-8683-cf2e95194792" containerID="a9be8dd357b310d027f1b6f4a9da499650bd87cadbcda4d082b8adb1635d08f4" exitCode=0 Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.625236 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" event={"ID":"76429f86-1efd-47a4-8683-cf2e95194792","Type":"ContainerDied","Data":"a9be8dd357b310d027f1b6f4a9da499650bd87cadbcda4d082b8adb1635d08f4"} Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.625307 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" event={"ID":"76429f86-1efd-47a4-8683-cf2e95194792","Type":"ContainerStarted","Data":"79362ce33fdac4f0b949d749d133eb36b0ef3fc6af247824f2f031290881ec39"} Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.627886 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fa80bcbe-b4a6-4515-b366-9ba9b0d92440","Type":"ContainerStarted","Data":"de0518c85889eef377a054adfae2eb664fb040171ea5d4dabd3a4cf8a0575053"} Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.642988 
4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-bnq8b" podStartSLOduration=2.642964644 podStartE2EDuration="2.642964644s" podCreationTimestamp="2025-11-22 09:30:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:30:24.638896634 +0000 UTC m=+999.574586283" watchObservedRunningTime="2025-11-22 09:30:24.642964644 +0000 UTC m=+999.578654293" Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.913624 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.996539 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-ovsdbserver-sb\") pod \"76429f86-1efd-47a4-8683-cf2e95194792\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.997002 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-config\") pod \"76429f86-1efd-47a4-8683-cf2e95194792\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.997146 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtm5t\" (UniqueName: \"kubernetes.io/projected/76429f86-1efd-47a4-8683-cf2e95194792-kube-api-access-dtm5t\") pod \"76429f86-1efd-47a4-8683-cf2e95194792\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " Nov 22 09:30:24 crc kubenswrapper[4846]: I1122 09:30:24.997341 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-dns-svc\") pod \"76429f86-1efd-47a4-8683-cf2e95194792\" (UID: \"76429f86-1efd-47a4-8683-cf2e95194792\") " Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.003923 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76429f86-1efd-47a4-8683-cf2e95194792-kube-api-access-dtm5t" (OuterVolumeSpecName: "kube-api-access-dtm5t") pod "76429f86-1efd-47a4-8683-cf2e95194792" (UID: "76429f86-1efd-47a4-8683-cf2e95194792"). InnerVolumeSpecName "kube-api-access-dtm5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.020429 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "76429f86-1efd-47a4-8683-cf2e95194792" (UID: "76429f86-1efd-47a4-8683-cf2e95194792"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.023908 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-config" (OuterVolumeSpecName: "config") pod "76429f86-1efd-47a4-8683-cf2e95194792" (UID: "76429f86-1efd-47a4-8683-cf2e95194792"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.024924 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "76429f86-1efd-47a4-8683-cf2e95194792" (UID: "76429f86-1efd-47a4-8683-cf2e95194792"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.100369 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.102187 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtm5t\" (UniqueName: \"kubernetes.io/projected/76429f86-1efd-47a4-8683-cf2e95194792-kube-api-access-dtm5t\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.102602 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.102618 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76429f86-1efd-47a4-8683-cf2e95194792-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.643378 4846 generic.go:334] "Generic (PLEG): container finished" podID="5ee1c219-1f23-4236-a364-84d9dc62d9a5" containerID="20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88" exitCode=0 Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.643474 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" event={"ID":"5ee1c219-1f23-4236-a364-84d9dc62d9a5","Type":"ContainerDied","Data":"20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88"} Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.645844 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.651194 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-zhf74" event={"ID":"76429f86-1efd-47a4-8683-cf2e95194792","Type":"ContainerDied","Data":"79362ce33fdac4f0b949d749d133eb36b0ef3fc6af247824f2f031290881ec39"} Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.651266 4846 scope.go:117] "RemoveContainer" containerID="a9be8dd357b310d027f1b6f4a9da499650bd87cadbcda4d082b8adb1635d08f4" Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.839038 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhf74"] Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.862474 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-zhf74"] Nov 22 09:30:25 crc kubenswrapper[4846]: I1122 09:30:25.954875 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.105282 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76429f86-1efd-47a4-8683-cf2e95194792" path="/var/lib/kubelet/pods/76429f86-1efd-47a4-8683-cf2e95194792/volumes" Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.114008 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmhkl"] Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.118163 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-jfr65"] Nov 22 09:30:26 crc kubenswrapper[4846]: E1122 09:30:26.118657 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76429f86-1efd-47a4-8683-cf2e95194792" containerName="init" Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.118671 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="76429f86-1efd-47a4-8683-cf2e95194792" containerName="init" Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.118915 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="76429f86-1efd-47a4-8683-cf2e95194792" containerName="init" Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.120863 4846 util.go:30] "No sandbox for pod can be found. 
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.120863 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.155313 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jfr65"]
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.227403 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqtwv\" (UniqueName: \"kubernetes.io/projected/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-kube-api-access-mqtwv\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.227888 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-config\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.227970 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-dns-svc\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.228013 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.228069 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.329599 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.329714 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqtwv\" (UniqueName: \"kubernetes.io/projected/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-kube-api-access-mqtwv\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.329767 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-config\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.329821 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-dns-svc\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.329851 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.330945 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.331543 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.332089 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-config\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.332206 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-dns-svc\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.353820 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqtwv\" (UniqueName: \"kubernetes.io/projected/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-kube-api-access-mqtwv\") pod \"dnsmasq-dns-698758b865-jfr65\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.468885 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.667841 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fa80bcbe-b4a6-4515-b366-9ba9b0d92440","Type":"ContainerStarted","Data":"8551cda46d75eaa7d37c8a53dd8035a16bcb6d9c6e047fc33fea6e75f1bfcfc0"}
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.668706 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fa80bcbe-b4a6-4515-b366-9ba9b0d92440","Type":"ContainerStarted","Data":"59d3dce16c48a79f0b9cd6d34e832ac27fc18a358b04d69a37bd8af9be2bdc88"}
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.668860 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.677982 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" event={"ID":"5ee1c219-1f23-4236-a364-84d9dc62d9a5","Type":"ContainerStarted","Data":"39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff"}
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.678534 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.690645 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.517253908 podStartE2EDuration="3.690613553s" podCreationTimestamp="2025-11-22 09:30:23 +0000 UTC" firstStartedPulling="2025-11-22 09:30:24.450907757 +0000 UTC m=+999.386597406" lastFinishedPulling="2025-11-22 09:30:25.624267382 +0000 UTC m=+1000.559957051" observedRunningTime="2025-11-22 09:30:26.6905221 +0000 UTC m=+1001.626211749" watchObservedRunningTime="2025-11-22 09:30:26.690613553 +0000 UTC m=+1001.626303202"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.729977 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" podStartSLOduration=3.729950195 podStartE2EDuration="3.729950195s" podCreationTimestamp="2025-11-22 09:30:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:30:26.720295102 +0000 UTC m=+1001.655984751" watchObservedRunningTime="2025-11-22 09:30:26.729950195 +0000 UTC m=+1001.665639834"
Nov 22 09:30:26 crc kubenswrapper[4846]: I1122 09:30:26.958526 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jfr65"]
Nov 22 09:30:26 crc kubenswrapper[4846]: W1122 09:30:26.981532 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35af4ff6_1042_4f6d_93da_bfb4f43fd04d.slice/crio-e611e28932f348fc935c333fb1c09099ebe592f6773d0ecc51ba462b8889c067 WatchSource:0}: Error finding container e611e28932f348fc935c333fb1c09099ebe592f6773d0ecc51ba462b8889c067: Status 404 returned error can't find the container with id e611e28932f348fc935c333fb1c09099ebe592f6773d0ecc51ba462b8889c067
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.189101 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.200311 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.202469 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-prswb"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.202853 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.203116 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.204647 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.221885 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.263110 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/575c6d2b-ae18-48ec-a314-211ccd078d87-lock\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.263178 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.263222 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.263315 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/575c6d2b-ae18-48ec-a314-211ccd078d87-cache\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.263649 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhjw2\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-kube-api-access-zhjw2\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.366103 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0"
Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.366225 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/575c6d2b-ae18-48ec-a314-211ccd078d87-cache\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0"
\"kube-api-access-zhjw2\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-kube-api-access-zhjw2\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.366354 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/575c6d2b-ae18-48ec-a314-211ccd078d87-lock\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.366378 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:27 crc kubenswrapper[4846]: E1122 09:30:27.366581 4846 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 09:30:27 crc kubenswrapper[4846]: E1122 09:30:27.366602 4846 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 09:30:27 crc kubenswrapper[4846]: E1122 09:30:27.366667 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift podName:575c6d2b-ae18-48ec-a314-211ccd078d87 nodeName:}" failed. No retries permitted until 2025-11-22 09:30:27.866643348 +0000 UTC m=+1002.802332997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift") pod "swift-storage-0" (UID: "575c6d2b-ae18-48ec-a314-211ccd078d87") : configmap "swift-ring-files" not found Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.367368 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.368239 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/575c6d2b-ae18-48ec-a314-211ccd078d87-cache\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.369029 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/575c6d2b-ae18-48ec-a314-211ccd078d87-lock\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.388076 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhjw2\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-kube-api-access-zhjw2\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.391114 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.690036 4846 generic.go:334] "Generic (PLEG): container finished" podID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerID="3a254155271141f405240411c56b83b79ce7e9c56eba71fd3691f8b9caa7d78f" exitCode=0 Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.690096 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jfr65" event={"ID":"35af4ff6-1042-4f6d-93da-bfb4f43fd04d","Type":"ContainerDied","Data":"3a254155271141f405240411c56b83b79ce7e9c56eba71fd3691f8b9caa7d78f"} Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.690162 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jfr65" event={"ID":"35af4ff6-1042-4f6d-93da-bfb4f43fd04d","Type":"ContainerStarted","Data":"e611e28932f348fc935c333fb1c09099ebe592f6773d0ecc51ba462b8889c067"} Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.690310 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" podUID="5ee1c219-1f23-4236-a364-84d9dc62d9a5" containerName="dnsmasq-dns" containerID="cri-o://39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff" gracePeriod=10 Nov 22 09:30:27 crc kubenswrapper[4846]: I1122 09:30:27.876271 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:27 crc kubenswrapper[4846]: E1122 09:30:27.876547 4846 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 09:30:27 crc kubenswrapper[4846]: E1122 09:30:27.876852 4846 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 09:30:27 crc kubenswrapper[4846]: E1122 09:30:27.876992 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift podName:575c6d2b-ae18-48ec-a314-211ccd078d87 nodeName:}" failed. No retries permitted until 2025-11-22 09:30:28.876965818 +0000 UTC m=+1003.812655467 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift") pod "swift-storage-0" (UID: "575c6d2b-ae18-48ec-a314-211ccd078d87") : configmap "swift-ring-files" not found Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.069817 4846 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.069817 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.181748 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-config\") pod \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") "
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.181817 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stl9r\" (UniqueName: \"kubernetes.io/projected/5ee1c219-1f23-4236-a364-84d9dc62d9a5-kube-api-access-stl9r\") pod \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") "
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.181889 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-dns-svc\") pod \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") "
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.181924 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-sb\") pod \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") "
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.181978 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-nb\") pod \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\" (UID: \"5ee1c219-1f23-4236-a364-84d9dc62d9a5\") "
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.187443 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee1c219-1f23-4236-a364-84d9dc62d9a5-kube-api-access-stl9r" (OuterVolumeSpecName: "kube-api-access-stl9r") pod "5ee1c219-1f23-4236-a364-84d9dc62d9a5" (UID: "5ee1c219-1f23-4236-a364-84d9dc62d9a5"). InnerVolumeSpecName "kube-api-access-stl9r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.228944 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ee1c219-1f23-4236-a364-84d9dc62d9a5" (UID: "5ee1c219-1f23-4236-a364-84d9dc62d9a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.232387 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ee1c219-1f23-4236-a364-84d9dc62d9a5" (UID: "5ee1c219-1f23-4236-a364-84d9dc62d9a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.240239 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ee1c219-1f23-4236-a364-84d9dc62d9a5" (UID: "5ee1c219-1f23-4236-a364-84d9dc62d9a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.265683 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-config" (OuterVolumeSpecName: "config") pod "5ee1c219-1f23-4236-a364-84d9dc62d9a5" (UID: "5ee1c219-1f23-4236-a364-84d9dc62d9a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.286338 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.286385 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-config\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.286399 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stl9r\" (UniqueName: \"kubernetes.io/projected/5ee1c219-1f23-4236-a364-84d9dc62d9a5-kube-api-access-stl9r\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.286411 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.286423 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1c219-1f23-4236-a364-84d9dc62d9a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.701803 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jfr65" event={"ID":"35af4ff6-1042-4f6d-93da-bfb4f43fd04d","Type":"ContainerStarted","Data":"e627d67b5ffbb2d432827153013d233f79bc6e2f4caddb0d54aeeaea2e1bc06d"}
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.702023 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jfr65"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.704894 4846 generic.go:334] "Generic (PLEG): container finished" podID="5ee1c219-1f23-4236-a364-84d9dc62d9a5" containerID="39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff" exitCode=0
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.704973 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" event={"ID":"5ee1c219-1f23-4236-a364-84d9dc62d9a5","Type":"ContainerDied","Data":"39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff"}
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.705010 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl" event={"ID":"5ee1c219-1f23-4236-a364-84d9dc62d9a5","Type":"ContainerDied","Data":"ea3d74cc302611d7e7d5556a036268ddea2f1999bff3d6178905739370b41244"}
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.705066 4846 scope.go:117] "RemoveContainer" containerID="39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.705281 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xmhkl"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.726603 4846 scope.go:117] "RemoveContainer" containerID="20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.729861 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-jfr65" podStartSLOduration=2.729838035 podStartE2EDuration="2.729838035s" podCreationTimestamp="2025-11-22 09:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:30:28.726841427 +0000 UTC m=+1003.662531076" watchObservedRunningTime="2025-11-22 09:30:28.729838035 +0000 UTC m=+1003.665527694"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.756929 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmhkl"]
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.764609 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xmhkl"]
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.770277 4846 scope.go:117] "RemoveContainer" containerID="39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff"
Nov 22 09:30:28 crc kubenswrapper[4846]: E1122 09:30:28.771219 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff\": container with ID starting with 39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff not found: ID does not exist" containerID="39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.771285 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff"} err="failed to get container status \"39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff\": rpc error: code = NotFound desc = could not find container \"39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff\": container with ID starting with 39835697e94bc599f52de5b4e3c3f33ac471c28498a8f15a8213c0e2f93e02ff not found: ID does not exist"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.771326 4846 scope.go:117] "RemoveContainer" containerID="20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88"
Nov 22 09:30:28 crc kubenswrapper[4846]: E1122 09:30:28.771698 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88\": container with ID starting with 20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88 not found: ID does not exist" containerID="20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.771737 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88"} err="failed to get container status \"20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88\": rpc error: code = NotFound desc = could not find container \"20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88\": container with ID starting with 20740c52c6507976668a5e7962ea111efc158d48287321c5c43cd5582e036c88 not found: ID does not exist"
Nov 22 09:30:28 crc kubenswrapper[4846]: I1122 09:30:28.898965 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0"
Nov 22 09:30:28 crc kubenswrapper[4846]: E1122 09:30:28.899448 4846 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Nov 22 09:30:28 crc kubenswrapper[4846]: E1122 09:30:28.899533 4846 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Nov 22 09:30:28 crc kubenswrapper[4846]: E1122 09:30:28.899656 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift podName:575c6d2b-ae18-48ec-a314-211ccd078d87 nodeName:}" failed. No retries permitted until 2025-11-22 09:30:30.899614649 +0000 UTC m=+1005.835304358 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift") pod "swift-storage-0" (UID: "575c6d2b-ae18-48ec-a314-211ccd078d87") : configmap "swift-ring-files" not found
Nov 22 09:30:29 crc kubenswrapper[4846]: I1122 09:30:29.786368 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Nov 22 09:30:29 crc kubenswrapper[4846]: I1122 09:30:29.906116 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Nov 22 09:30:30 crc kubenswrapper[4846]: I1122 09:30:30.049517 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee1c219-1f23-4236-a364-84d9dc62d9a5" path="/var/lib/kubelet/pods/5ee1c219-1f23-4236-a364-84d9dc62d9a5/volumes"
Nov 22 09:30:30 crc kubenswrapper[4846]: I1122 09:30:30.936705 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0"
Nov 22 09:30:30 crc kubenswrapper[4846]: E1122 09:30:30.936993 4846 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Nov 22 09:30:30 crc kubenswrapper[4846]: E1122 09:30:30.937037 4846 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift") pod "swift-storage-0" (UID: "575c6d2b-ae18-48ec-a314-211ccd078d87") : configmap "swift-ring-files" not found Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.172558 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-hgdvt"] Nov 22 09:30:31 crc kubenswrapper[4846]: E1122 09:30:31.173689 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee1c219-1f23-4236-a364-84d9dc62d9a5" containerName="init" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.173716 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee1c219-1f23-4236-a364-84d9dc62d9a5" containerName="init" Nov 22 09:30:31 crc kubenswrapper[4846]: E1122 09:30:31.173734 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee1c219-1f23-4236-a364-84d9dc62d9a5" containerName="dnsmasq-dns" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.173743 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee1c219-1f23-4236-a364-84d9dc62d9a5" containerName="dnsmasq-dns" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.174009 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee1c219-1f23-4236-a364-84d9dc62d9a5" containerName="dnsmasq-dns" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.174907 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.178373 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.178649 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.178925 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.201263 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hgdvt"] Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.241275 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-combined-ca-bundle\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.241346 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-dispersionconf\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.241389 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t8xf\" (UniqueName: \"kubernetes.io/projected/6f537097-bfac-4915-833f-ee9a52e7d8a5-kube-api-access-4t8xf\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.241416 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-scripts\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.241440 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-ring-data-devices\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.241495 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f537097-bfac-4915-833f-ee9a52e7d8a5-etc-swift\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.241524 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-swiftconf\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.343700 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-dispersionconf\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.343825 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t8xf\" (UniqueName: \"kubernetes.io/projected/6f537097-bfac-4915-833f-ee9a52e7d8a5-kube-api-access-4t8xf\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.343863 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-scripts\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.343899 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-ring-data-devices\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.344004 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f537097-bfac-4915-833f-ee9a52e7d8a5-etc-swift\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.344036 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-swiftconf\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.344157 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-combined-ca-bundle\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.345205 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f537097-bfac-4915-833f-ee9a52e7d8a5-etc-swift\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.345474 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-ring-data-devices\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.345512 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-scripts\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.353329 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-swiftconf\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.354327 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-dispersionconf\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.359214 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-combined-ca-bundle\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.365981 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t8xf\" (UniqueName: \"kubernetes.io/projected/6f537097-bfac-4915-833f-ee9a52e7d8a5-kube-api-access-4t8xf\") pod \"swift-ring-rebalance-hgdvt\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:31 crc kubenswrapper[4846]: I1122 09:30:31.497871 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:32 crc kubenswrapper[4846]: W1122 09:30:32.057354 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f537097_bfac_4915_833f_ee9a52e7d8a5.slice/crio-fdec4581f9d37e21ffaca68011a2be1ca7dfed132f1f7ae23acbe75ed2d2a60b WatchSource:0}: Error finding container fdec4581f9d37e21ffaca68011a2be1ca7dfed132f1f7ae23acbe75ed2d2a60b: Status 404 returned error can't find the container with id fdec4581f9d37e21ffaca68011a2be1ca7dfed132f1f7ae23acbe75ed2d2a60b Nov 22 09:30:32 crc kubenswrapper[4846]: I1122 09:30:32.060021 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hgdvt"] Nov 22 09:30:32 crc kubenswrapper[4846]: I1122 09:30:32.269165 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 22 09:30:32 crc kubenswrapper[4846]: I1122 09:30:32.269397 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 22 09:30:32 crc kubenswrapper[4846]: I1122 09:30:32.380629 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 22 09:30:32 crc kubenswrapper[4846]: I1122 09:30:32.753386 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hgdvt" event={"ID":"6f537097-bfac-4915-833f-ee9a52e7d8a5","Type":"ContainerStarted","Data":"fdec4581f9d37e21ffaca68011a2be1ca7dfed132f1f7ae23acbe75ed2d2a60b"} Nov 22 09:30:32 crc kubenswrapper[4846]: I1122 09:30:32.858825 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.633164 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-dfac-account-create-jx6r6"] Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.634627 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.638257 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.642410 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dfac-account-create-jx6r6"] Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.689034 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-scf94"] Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.696985 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-scf94" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.714012 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-scf94"] Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.799847 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mk8\" (UniqueName: \"kubernetes.io/projected/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-kube-api-access-59mk8\") pod \"keystone-dfac-account-create-jx6r6\" (UID: \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\") " pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.807285 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-operator-scripts\") pod \"keystone-dfac-account-create-jx6r6\" (UID: \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\") " pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.807500 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhzg4\" (UniqueName: \"kubernetes.io/projected/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-kube-api-access-bhzg4\") pod \"keystone-db-create-scf94\" (UID: \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\") " pod="openstack/keystone-db-create-scf94" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.808123 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-operator-scripts\") pod \"keystone-db-create-scf94\" (UID: \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\") " pod="openstack/keystone-db-create-scf94" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.912238 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-operator-scripts\") pod \"keystone-db-create-scf94\" (UID: \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\") " pod="openstack/keystone-db-create-scf94" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.912393 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59mk8\" (UniqueName: \"kubernetes.io/projected/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-kube-api-access-59mk8\") pod \"keystone-dfac-account-create-jx6r6\" (UID: \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\") " pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.912446 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-operator-scripts\") pod \"keystone-dfac-account-create-jx6r6\" (UID: \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\") " pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.912553 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhzg4\" (UniqueName: \"kubernetes.io/projected/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-kube-api-access-bhzg4\") pod \"keystone-db-create-scf94\" (UID: \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\") " pod="openstack/keystone-db-create-scf94" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 
09:30:33.913845 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-operator-scripts\") pod \"keystone-dfac-account-create-jx6r6\" (UID: \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\") " pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.914364 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-operator-scripts\") pod \"keystone-db-create-scf94\" (UID: \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\") " pod="openstack/keystone-db-create-scf94" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.930795 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-l2vdd"] Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.932087 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.939241 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-l2vdd"] Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.958414 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhzg4\" (UniqueName: \"kubernetes.io/projected/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-kube-api-access-bhzg4\") pod \"keystone-db-create-scf94\" (UID: \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\") " pod="openstack/keystone-db-create-scf94" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.959643 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mk8\" (UniqueName: \"kubernetes.io/projected/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-kube-api-access-59mk8\") pod \"keystone-dfac-account-create-jx6r6\" (UID: \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\") " pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:33 crc kubenswrapper[4846]: I1122 09:30:33.976722 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.022076 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-scf94" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.033145 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-15ca-account-create-4jszs"] Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.034965 4846 util.go:30] "No sandbox for pod can be found. 
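[Note] Each of these short-lived create jobs mounts an operator-scripts ConfigMap plus an auto-injected kube-api-access-* projected volume. The sketch below shows what such a projected volume typically bundles (service-account token, cluster CA bundle, and pod namespace), built with the k8s.io/api types; the exact composition is version-dependent and the values here are illustrative, not read from this cluster.

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func int64Ptr(i int64) *int64 { return &i }

func main() {
	// Sketch of the shape of an auto-injected volume like kube-api-access-59mk8
	// above: a projected volume bundling the service account token, the cluster
	// CA bundle, and the pod namespace. Illustrative only.
	vol := corev1.Volume{
		Name: "kube-api-access-59mk8",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: int64Ptr(3607),
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
						}},
					}},
				},
			},
		},
	}
	fmt.Printf("%s projects %d sources\n", vol.Name, len(vol.Projected.Sources))
}
```

Because the token and CA land under one projected mount, the kubelet logs a single MountVolume pair per pod for API access, as seen above.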
Need to start a new one" pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.037654 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.059099 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-15ca-account-create-4jszs"] Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.119743 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92mnb\" (UniqueName: \"kubernetes.io/projected/70ebb80d-1c77-4582-a075-78376fe2c7dd-kube-api-access-92mnb\") pod \"placement-db-create-l2vdd\" (UID: \"70ebb80d-1c77-4582-a075-78376fe2c7dd\") " pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.119835 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70ebb80d-1c77-4582-a075-78376fe2c7dd-operator-scripts\") pod \"placement-db-create-l2vdd\" (UID: \"70ebb80d-1c77-4582-a075-78376fe2c7dd\") " pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.176920 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-c9qdj"] Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.178607 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.196001 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-c9qdj"] Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.221595 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7209d158-55d3-457e-a685-83d7a82fb290-operator-scripts\") pod \"placement-15ca-account-create-4jszs\" (UID: \"7209d158-55d3-457e-a685-83d7a82fb290\") " pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.221672 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70ebb80d-1c77-4582-a075-78376fe2c7dd-operator-scripts\") pod \"placement-db-create-l2vdd\" (UID: \"70ebb80d-1c77-4582-a075-78376fe2c7dd\") " pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.221710 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t2b9\" (UniqueName: \"kubernetes.io/projected/7209d158-55d3-457e-a685-83d7a82fb290-kube-api-access-7t2b9\") pod \"placement-15ca-account-create-4jszs\" (UID: \"7209d158-55d3-457e-a685-83d7a82fb290\") " pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.222352 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92mnb\" (UniqueName: \"kubernetes.io/projected/70ebb80d-1c77-4582-a075-78376fe2c7dd-kube-api-access-92mnb\") pod \"placement-db-create-l2vdd\" (UID: \"70ebb80d-1c77-4582-a075-78376fe2c7dd\") " pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.222853 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/70ebb80d-1c77-4582-a075-78376fe2c7dd-operator-scripts\") pod \"placement-db-create-l2vdd\" (UID: \"70ebb80d-1c77-4582-a075-78376fe2c7dd\") " pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.245324 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92mnb\" (UniqueName: \"kubernetes.io/projected/70ebb80d-1c77-4582-a075-78376fe2c7dd-kube-api-access-92mnb\") pod \"placement-db-create-l2vdd\" (UID: \"70ebb80d-1c77-4582-a075-78376fe2c7dd\") " pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.323491 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7209d158-55d3-457e-a685-83d7a82fb290-operator-scripts\") pod \"placement-15ca-account-create-4jszs\" (UID: \"7209d158-55d3-457e-a685-83d7a82fb290\") " pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.323553 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t2b9\" (UniqueName: \"kubernetes.io/projected/7209d158-55d3-457e-a685-83d7a82fb290-kube-api-access-7t2b9\") pod \"placement-15ca-account-create-4jszs\" (UID: \"7209d158-55d3-457e-a685-83d7a82fb290\") " pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.323626 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px7s8\" (UniqueName: \"kubernetes.io/projected/d0bc7919-add5-46fe-ab1b-26b7b3e114de-kube-api-access-px7s8\") pod \"glance-db-create-c9qdj\" (UID: \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\") " pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.323658 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bc7919-add5-46fe-ab1b-26b7b3e114de-operator-scripts\") pod \"glance-db-create-c9qdj\" (UID: \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\") " pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.324528 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7209d158-55d3-457e-a685-83d7a82fb290-operator-scripts\") pod \"placement-15ca-account-create-4jszs\" (UID: \"7209d158-55d3-457e-a685-83d7a82fb290\") " pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.335498 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4809-account-create-tsnms"] Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.338660 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.342149 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.346730 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4809-account-create-tsnms"] Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.354415 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t2b9\" (UniqueName: \"kubernetes.io/projected/7209d158-55d3-457e-a685-83d7a82fb290-kube-api-access-7t2b9\") pod \"placement-15ca-account-create-4jszs\" (UID: \"7209d158-55d3-457e-a685-83d7a82fb290\") " pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.378855 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.378873 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.425130 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bc7919-add5-46fe-ab1b-26b7b3e114de-operator-scripts\") pod \"glance-db-create-c9qdj\" (UID: \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\") " pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.425412 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px7s8\" (UniqueName: \"kubernetes.io/projected/d0bc7919-add5-46fe-ab1b-26b7b3e114de-kube-api-access-px7s8\") pod \"glance-db-create-c9qdj\" (UID: \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\") " pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.426364 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bc7919-add5-46fe-ab1b-26b7b3e114de-operator-scripts\") pod \"glance-db-create-c9qdj\" (UID: \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\") " pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.450909 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px7s8\" (UniqueName: \"kubernetes.io/projected/d0bc7919-add5-46fe-ab1b-26b7b3e114de-kube-api-access-px7s8\") pod \"glance-db-create-c9qdj\" (UID: \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\") " pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.498771 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.527728 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q4lw\" (UniqueName: \"kubernetes.io/projected/e3695d7a-eba8-4780-9995-e47e5989da34-kube-api-access-6q4lw\") pod \"glance-4809-account-create-tsnms\" (UID: \"e3695d7a-eba8-4780-9995-e47e5989da34\") " pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.527989 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3695d7a-eba8-4780-9995-e47e5989da34-operator-scripts\") pod \"glance-4809-account-create-tsnms\" (UID: \"e3695d7a-eba8-4780-9995-e47e5989da34\") " pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.630238 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3695d7a-eba8-4780-9995-e47e5989da34-operator-scripts\") pod \"glance-4809-account-create-tsnms\" (UID: \"e3695d7a-eba8-4780-9995-e47e5989da34\") " pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.630356 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q4lw\" (UniqueName: \"kubernetes.io/projected/e3695d7a-eba8-4780-9995-e47e5989da34-kube-api-access-6q4lw\") pod \"glance-4809-account-create-tsnms\" (UID: \"e3695d7a-eba8-4780-9995-e47e5989da34\") " pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.631154 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3695d7a-eba8-4780-9995-e47e5989da34-operator-scripts\") pod \"glance-4809-account-create-tsnms\" (UID: \"e3695d7a-eba8-4780-9995-e47e5989da34\") " pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.650001 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q4lw\" (UniqueName: \"kubernetes.io/projected/e3695d7a-eba8-4780-9995-e47e5989da34-kube-api-access-6q4lw\") pod \"glance-4809-account-create-tsnms\" (UID: \"e3695d7a-eba8-4780-9995-e47e5989da34\") " pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:34 crc kubenswrapper[4846]: I1122 09:30:34.730092 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:35 crc kubenswrapper[4846]: E1122 09:30:35.058289 4846 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 09:30:35 crc kubenswrapper[4846]: E1122 09:30:35.058682 4846 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 09:30:35 crc kubenswrapper[4846]: E1122 09:30:35.058763 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift podName:575c6d2b-ae18-48ec-a314-211ccd078d87 nodeName:}" failed. No retries permitted until 2025-11-22 09:30:43.05873722 +0000 UTC m=+1017.994426869 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift") pod "swift-storage-0" (UID: "575c6d2b-ae18-48ec-a314-211ccd078d87") : configmap "swift-ring-files" not found Nov 22 09:30:35 crc kubenswrapper[4846]: I1122 09:30:35.039660 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.473589 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-jfr65" Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.560406 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g6cfm"] Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.561088 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" podUID="ac74ecfd-8981-4682-847e-b8c23742bfd0" containerName="dnsmasq-dns" containerID="cri-o://7e184037fabe79602b84b12a8df261859ca6627813657c26cd59c192726cafdd" gracePeriod=10 Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.728607 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-scf94"] Nov 22 09:30:36 crc kubenswrapper[4846]: W1122 09:30:36.739840 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70ebb80d_1c77_4582_a075_78376fe2c7dd.slice/crio-3c82038693491f420c0d7a34a209d9c7fec908a92cee60618381a168ca9c0b17 WatchSource:0}: Error finding container 3c82038693491f420c0d7a34a209d9c7fec908a92cee60618381a168ca9c0b17: Status 404 returned error can't find the container with id 3c82038693491f420c0d7a34a209d9c7fec908a92cee60618381a168ca9c0b17 Nov 22 09:30:36 crc kubenswrapper[4846]: W1122 09:30:36.742661 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d63f9ef_2da3_4b25_8a16_b7bbeba5c2d5.slice/crio-ef61d94632649905a8553410691b5170bd17f985e91f8c23c2ab5c5ca1714738 WatchSource:0}: Error finding container ef61d94632649905a8553410691b5170bd17f985e91f8c23c2ab5c5ca1714738: Status 404 returned error can't find the container with id ef61d94632649905a8553410691b5170bd17f985e91f8c23c2ab5c5ca1714738 Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.743871 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-l2vdd"] Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.795909 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-scf94" event={"ID":"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5","Type":"ContainerStarted","Data":"ef61d94632649905a8553410691b5170bd17f985e91f8c23c2ab5c5ca1714738"} Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.798538 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hgdvt" event={"ID":"6f537097-bfac-4915-833f-ee9a52e7d8a5","Type":"ContainerStarted","Data":"d4386ba9097111d54bec85a5d066c56cf3f28fc16f42d09aa3eb80e5f90be442"} Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.802900 4846 generic.go:334] "Generic (PLEG): container finished" podID="ac74ecfd-8981-4682-847e-b8c23742bfd0" 
containerID="7e184037fabe79602b84b12a8df261859ca6627813657c26cd59c192726cafdd" exitCode=0 Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.802993 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" event={"ID":"ac74ecfd-8981-4682-847e-b8c23742bfd0","Type":"ContainerDied","Data":"7e184037fabe79602b84b12a8df261859ca6627813657c26cd59c192726cafdd"} Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.806381 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l2vdd" event={"ID":"70ebb80d-1c77-4582-a075-78376fe2c7dd","Type":"ContainerStarted","Data":"3c82038693491f420c0d7a34a209d9c7fec908a92cee60618381a168ca9c0b17"} Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.825757 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-hgdvt" podStartSLOduration=1.6091774669999999 podStartE2EDuration="5.825731848s" podCreationTimestamp="2025-11-22 09:30:31 +0000 UTC" firstStartedPulling="2025-11-22 09:30:32.060845682 +0000 UTC m=+1006.996535331" lastFinishedPulling="2025-11-22 09:30:36.277400063 +0000 UTC m=+1011.213089712" observedRunningTime="2025-11-22 09:30:36.81863077 +0000 UTC m=+1011.754320419" watchObservedRunningTime="2025-11-22 09:30:36.825731848 +0000 UTC m=+1011.761421497" Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.904272 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-15ca-account-create-4jszs"] Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.918074 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-c9qdj"] Nov 22 09:30:36 crc kubenswrapper[4846]: W1122 09:30:36.923732 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7209d158_55d3_457e_a685_83d7a82fb290.slice/crio-32bb606f256969aba0fd627205cd3834d48cc2d1e3eeed7db07a90d1443d8f28 WatchSource:0}: Error finding container 32bb606f256969aba0fd627205cd3834d48cc2d1e3eeed7db07a90d1443d8f28: Status 404 returned error can't find the container with id 32bb606f256969aba0fd627205cd3834d48cc2d1e3eeed7db07a90d1443d8f28 Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.929729 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4809-account-create-tsnms"] Nov 22 09:30:36 crc kubenswrapper[4846]: I1122 09:30:36.933450 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dfac-account-create-jx6r6"] Nov 22 09:30:36 crc kubenswrapper[4846]: W1122 09:30:36.940664 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeddb06eb_36f7_48ba_acbc_b2129ca2b43d.slice/crio-27473ec38235649f1dfddae281ef783b29503b9a2d20611d2d3eb6ba8d069f61 WatchSource:0}: Error finding container 27473ec38235649f1dfddae281ef783b29503b9a2d20611d2d3eb6ba8d069f61: Status 404 returned error can't find the container with id 27473ec38235649f1dfddae281ef783b29503b9a2d20611d2d3eb6ba8d069f61 Nov 22 09:30:36 crc kubenswrapper[4846]: W1122 09:30:36.942835 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3695d7a_eba8_4780_9995_e47e5989da34.slice/crio-be32a8f9c3549a82f151f5f3ff57fdb21ffb698d745b13a7e7283700bc8e8bf8 WatchSource:0}: Error finding container be32a8f9c3549a82f151f5f3ff57fdb21ffb698d745b13a7e7283700bc8e8bf8: Status 404 returned error can't find the 
container with id be32a8f9c3549a82f151f5f3ff57fdb21ffb698d745b13a7e7283700bc8e8bf8 Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.079751 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.118613 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86bfz\" (UniqueName: \"kubernetes.io/projected/ac74ecfd-8981-4682-847e-b8c23742bfd0-kube-api-access-86bfz\") pod \"ac74ecfd-8981-4682-847e-b8c23742bfd0\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.118763 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-config\") pod \"ac74ecfd-8981-4682-847e-b8c23742bfd0\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.118936 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-dns-svc\") pod \"ac74ecfd-8981-4682-847e-b8c23742bfd0\" (UID: \"ac74ecfd-8981-4682-847e-b8c23742bfd0\") " Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.130422 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac74ecfd-8981-4682-847e-b8c23742bfd0-kube-api-access-86bfz" (OuterVolumeSpecName: "kube-api-access-86bfz") pod "ac74ecfd-8981-4682-847e-b8c23742bfd0" (UID: "ac74ecfd-8981-4682-847e-b8c23742bfd0"). InnerVolumeSpecName "kube-api-access-86bfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.220967 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86bfz\" (UniqueName: \"kubernetes.io/projected/ac74ecfd-8981-4682-847e-b8c23742bfd0-kube-api-access-86bfz\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.264095 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac74ecfd-8981-4682-847e-b8c23742bfd0" (UID: "ac74ecfd-8981-4682-847e-b8c23742bfd0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.264942 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-config" (OuterVolumeSpecName: "config") pod "ac74ecfd-8981-4682-847e-b8c23742bfd0" (UID: "ac74ecfd-8981-4682-847e-b8c23742bfd0"). InnerVolumeSpecName "config". 
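[Note] The startup-latency record for swift-ring-rebalance-hgdvt just above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that minus the image-pull window (lastFinishedPulling minus firstStartedPulling), at least for the values in this record; kubelet's tracker may differ in detail. The arithmetic, checked in Go with the timestamps copied from the record:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps from the swift-ring-rebalance-hgdvt startup-latency record.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-11-22 09:30:31 +0000 UTC")               // podCreationTimestamp
	firstPull := parse("2025-11-22 09:30:32.060845682 +0000 UTC")   // firstStartedPulling
	lastPull := parse("2025-11-22 09:30:36.277400063 +0000 UTC")    // lastFinishedPulling
	running := parse("2025-11-22 09:30:36.825731848 +0000 UTC")     // watchObservedRunningTime

	e2e := running.Sub(created)          // 5.825731848s, matches podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 1.609177467s, matches podStartSLOduration
	fmt.Println("E2E:", e2e, "SLO:", slo)
}
```

The earlier dnsmasq-dns-698758b865-jfr65 record fits the same reading: with zero-valued pull timestamps, SLO and E2E durations coincide.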
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.323212 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.323269 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac74ecfd-8981-4682-847e-b8c23742bfd0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.833107 4846 generic.go:334] "Generic (PLEG): container finished" podID="5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5" containerID="32497e82c613baa575e2095e88dfd21377d9256d60c37c294b6386179b552051" exitCode=0 Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.833211 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-scf94" event={"ID":"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5","Type":"ContainerDied","Data":"32497e82c613baa575e2095e88dfd21377d9256d60c37c294b6386179b552051"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.838697 4846 generic.go:334] "Generic (PLEG): container finished" podID="d0bc7919-add5-46fe-ab1b-26b7b3e114de" containerID="5b1dbd0c28487979c863dcb19ba83b7ddd62b6a7b5fea1a2c8d8ad769274c537" exitCode=0 Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.838863 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c9qdj" event={"ID":"d0bc7919-add5-46fe-ab1b-26b7b3e114de","Type":"ContainerDied","Data":"5b1dbd0c28487979c863dcb19ba83b7ddd62b6a7b5fea1a2c8d8ad769274c537"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.838962 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c9qdj" event={"ID":"d0bc7919-add5-46fe-ab1b-26b7b3e114de","Type":"ContainerStarted","Data":"c0522687a9f957fcf329ec68e84a1ebc37525c87b17ded9675cbfaee49a224f3"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.841563 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" event={"ID":"ac74ecfd-8981-4682-847e-b8c23742bfd0","Type":"ContainerDied","Data":"de99244457e309e9caa068939e95fb0f1d687ab83d4ba075c44510b1a8d77e2b"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.841737 4846 scope.go:117] "RemoveContainer" containerID="7e184037fabe79602b84b12a8df261859ca6627813657c26cd59c192726cafdd" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.842098 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-g6cfm" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.845100 4846 generic.go:334] "Generic (PLEG): container finished" podID="eddb06eb-36f7-48ba-acbc-b2129ca2b43d" containerID="92a9703249142e8c1f9cda32fa8b94bbe087b4af2bca45f48d837c10559628ba" exitCode=0 Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.845190 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfac-account-create-jx6r6" event={"ID":"eddb06eb-36f7-48ba-acbc-b2129ca2b43d","Type":"ContainerDied","Data":"92a9703249142e8c1f9cda32fa8b94bbe087b4af2bca45f48d837c10559628ba"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.845224 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfac-account-create-jx6r6" event={"ID":"eddb06eb-36f7-48ba-acbc-b2129ca2b43d","Type":"ContainerStarted","Data":"27473ec38235649f1dfddae281ef783b29503b9a2d20611d2d3eb6ba8d069f61"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.853404 4846 generic.go:334] "Generic (PLEG): container finished" podID="70ebb80d-1c77-4582-a075-78376fe2c7dd" containerID="86d3518959e308545599333a51e77cd235a28178a8e9f68826241bf425a0a446" exitCode=0 Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.853538 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l2vdd" event={"ID":"70ebb80d-1c77-4582-a075-78376fe2c7dd","Type":"ContainerDied","Data":"86d3518959e308545599333a51e77cd235a28178a8e9f68826241bf425a0a446"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.857847 4846 generic.go:334] "Generic (PLEG): container finished" podID="7209d158-55d3-457e-a685-83d7a82fb290" containerID="f39cac436f676610a98305350fcf4bccf58087cdd93c19debe4295b147e4fa20" exitCode=0 Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.857926 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-15ca-account-create-4jszs" event={"ID":"7209d158-55d3-457e-a685-83d7a82fb290","Type":"ContainerDied","Data":"f39cac436f676610a98305350fcf4bccf58087cdd93c19debe4295b147e4fa20"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.857956 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-15ca-account-create-4jszs" event={"ID":"7209d158-55d3-457e-a685-83d7a82fb290","Type":"ContainerStarted","Data":"32bb606f256969aba0fd627205cd3834d48cc2d1e3eeed7db07a90d1443d8f28"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.867938 4846 generic.go:334] "Generic (PLEG): container finished" podID="e3695d7a-eba8-4780-9995-e47e5989da34" containerID="6a99b492d417f7aa9dbf48002719dc4cbc9c84d5be47aee1b141fc93d6319fbc" exitCode=0 Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.868430 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4809-account-create-tsnms" event={"ID":"e3695d7a-eba8-4780-9995-e47e5989da34","Type":"ContainerDied","Data":"6a99b492d417f7aa9dbf48002719dc4cbc9c84d5be47aee1b141fc93d6319fbc"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.868910 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4809-account-create-tsnms" event={"ID":"e3695d7a-eba8-4780-9995-e47e5989da34","Type":"ContainerStarted","Data":"be32a8f9c3549a82f151f5f3ff57fdb21ffb698d745b13a7e7283700bc8e8bf8"} Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.889939 4846 scope.go:117] "RemoveContainer" containerID="858662d87849c597693415f31b4a0bad265d4a926f02c1da0f689e2f7b6104e9" Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 
09:30:37.943145 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g6cfm"] Nov 22 09:30:37 crc kubenswrapper[4846]: I1122 09:30:37.952430 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-g6cfm"] Nov 22 09:30:38 crc kubenswrapper[4846]: I1122 09:30:38.069020 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac74ecfd-8981-4682-847e-b8c23742bfd0" path="/var/lib/kubelet/pods/ac74ecfd-8981-4682-847e-b8c23742bfd0/volumes" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.229935 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.356876 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.472727 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70ebb80d-1c77-4582-a075-78376fe2c7dd-operator-scripts\") pod \"70ebb80d-1c77-4582-a075-78376fe2c7dd\" (UID: \"70ebb80d-1c77-4582-a075-78376fe2c7dd\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.472882 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92mnb\" (UniqueName: \"kubernetes.io/projected/70ebb80d-1c77-4582-a075-78376fe2c7dd-kube-api-access-92mnb\") pod \"70ebb80d-1c77-4582-a075-78376fe2c7dd\" (UID: \"70ebb80d-1c77-4582-a075-78376fe2c7dd\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.474645 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70ebb80d-1c77-4582-a075-78376fe2c7dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70ebb80d-1c77-4582-a075-78376fe2c7dd" (UID: "70ebb80d-1c77-4582-a075-78376fe2c7dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.479875 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ebb80d-1c77-4582-a075-78376fe2c7dd-kube-api-access-92mnb" (OuterVolumeSpecName: "kube-api-access-92mnb") pod "70ebb80d-1c77-4582-a075-78376fe2c7dd" (UID: "70ebb80d-1c77-4582-a075-78376fe2c7dd"). InnerVolumeSpecName "kube-api-access-92mnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.574938 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70ebb80d-1c77-4582-a075-78376fe2c7dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.574977 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92mnb\" (UniqueName: \"kubernetes.io/projected/70ebb80d-1c77-4582-a075-78376fe2c7dd-kube-api-access-92mnb\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.653058 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.657700 4846 util.go:48] "No ready sandbox for pod can be found. 
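[Note] The reconciler_common messages that dominate this section come from one loop diffing a desired world of volumes against an actual world: VerifyControllerAttachedVolume and MountVolume (lines :245/:218) for volumes that should be mounted but are not, UnmountVolume (:159) and "Volume detached" (:293) for volumes that are mounted but no longer wanted, as in the dnsmasq and db-create teardowns here. A toy, dependency-free model of that diff; real kubelet tracks far more state.

```go
package main

import "fmt"

// Toy model of the volume reconciler whose messages
// (reconciler_common.go:159/218/245/293) dominate this log.
func reconcile(desired, actual map[string]bool) {
	for vol := range desired {
		if !actual[vol] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
		}
	}
	for vol := range actual {
		if !desired[vol] {
			fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", vol)
		}
	}
}

func main() {
	// After the dnsmasq-dns-57d769cc4f pod is deleted, its volumes remain in
	// the actual world only, so they are unmounted and then reported detached.
	desired := map[string]bool{}
	actual := map[string]bool{
		"kube-api-access-86bfz": true,
		"config":                true,
		"dns-svc":               true,
	}
	reconcile(desired, actual)
}
```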
Need to start a new one" pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.664318 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-scf94" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.675755 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t2b9\" (UniqueName: \"kubernetes.io/projected/7209d158-55d3-457e-a685-83d7a82fb290-kube-api-access-7t2b9\") pod \"7209d158-55d3-457e-a685-83d7a82fb290\" (UID: \"7209d158-55d3-457e-a685-83d7a82fb290\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.675853 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-operator-scripts\") pod \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\" (UID: \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.675939 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7209d158-55d3-457e-a685-83d7a82fb290-operator-scripts\") pod \"7209d158-55d3-457e-a685-83d7a82fb290\" (UID: \"7209d158-55d3-457e-a685-83d7a82fb290\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.675996 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59mk8\" (UniqueName: \"kubernetes.io/projected/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-kube-api-access-59mk8\") pod \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\" (UID: \"eddb06eb-36f7-48ba-acbc-b2129ca2b43d\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.676456 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eddb06eb-36f7-48ba-acbc-b2129ca2b43d" (UID: "eddb06eb-36f7-48ba-acbc-b2129ca2b43d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.676654 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7209d158-55d3-457e-a685-83d7a82fb290-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7209d158-55d3-457e-a685-83d7a82fb290" (UID: "7209d158-55d3-457e-a685-83d7a82fb290"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.679540 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7209d158-55d3-457e-a685-83d7a82fb290-kube-api-access-7t2b9" (OuterVolumeSpecName: "kube-api-access-7t2b9") pod "7209d158-55d3-457e-a685-83d7a82fb290" (UID: "7209d158-55d3-457e-a685-83d7a82fb290"). InnerVolumeSpecName "kube-api-access-7t2b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.682076 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-kube-api-access-59mk8" (OuterVolumeSpecName: "kube-api-access-59mk8") pod "eddb06eb-36f7-48ba-acbc-b2129ca2b43d" (UID: "eddb06eb-36f7-48ba-acbc-b2129ca2b43d"). InnerVolumeSpecName "kube-api-access-59mk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.683692 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.684290 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.777439 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px7s8\" (UniqueName: \"kubernetes.io/projected/d0bc7919-add5-46fe-ab1b-26b7b3e114de-kube-api-access-px7s8\") pod \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\" (UID: \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.777518 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bc7919-add5-46fe-ab1b-26b7b3e114de-operator-scripts\") pod \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\" (UID: \"d0bc7919-add5-46fe-ab1b-26b7b3e114de\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.777611 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhzg4\" (UniqueName: \"kubernetes.io/projected/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-kube-api-access-bhzg4\") pod \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\" (UID: \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.777633 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-operator-scripts\") pod \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\" (UID: \"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.777708 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q4lw\" (UniqueName: \"kubernetes.io/projected/e3695d7a-eba8-4780-9995-e47e5989da34-kube-api-access-6q4lw\") pod \"e3695d7a-eba8-4780-9995-e47e5989da34\" (UID: \"e3695d7a-eba8-4780-9995-e47e5989da34\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.777728 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3695d7a-eba8-4780-9995-e47e5989da34-operator-scripts\") pod \"e3695d7a-eba8-4780-9995-e47e5989da34\" (UID: \"e3695d7a-eba8-4780-9995-e47e5989da34\") " Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.778302 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t2b9\" (UniqueName: \"kubernetes.io/projected/7209d158-55d3-457e-a685-83d7a82fb290-kube-api-access-7t2b9\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.778322 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.778333 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7209d158-55d3-457e-a685-83d7a82fb290-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.778344 4846 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-59mk8\" (UniqueName: \"kubernetes.io/projected/eddb06eb-36f7-48ba-acbc-b2129ca2b43d-kube-api-access-59mk8\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.778571 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bc7919-add5-46fe-ab1b-26b7b3e114de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0bc7919-add5-46fe-ab1b-26b7b3e114de" (UID: "d0bc7919-add5-46fe-ab1b-26b7b3e114de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.778597 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5" (UID: "5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.778754 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3695d7a-eba8-4780-9995-e47e5989da34-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3695d7a-eba8-4780-9995-e47e5989da34" (UID: "e3695d7a-eba8-4780-9995-e47e5989da34"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.782917 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bc7919-add5-46fe-ab1b-26b7b3e114de-kube-api-access-px7s8" (OuterVolumeSpecName: "kube-api-access-px7s8") pod "d0bc7919-add5-46fe-ab1b-26b7b3e114de" (UID: "d0bc7919-add5-46fe-ab1b-26b7b3e114de"). InnerVolumeSpecName "kube-api-access-px7s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.783645 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3695d7a-eba8-4780-9995-e47e5989da34-kube-api-access-6q4lw" (OuterVolumeSpecName: "kube-api-access-6q4lw") pod "e3695d7a-eba8-4780-9995-e47e5989da34" (UID: "e3695d7a-eba8-4780-9995-e47e5989da34"). InnerVolumeSpecName "kube-api-access-6q4lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.783894 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-kube-api-access-bhzg4" (OuterVolumeSpecName: "kube-api-access-bhzg4") pod "5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5" (UID: "5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5"). InnerVolumeSpecName "kube-api-access-bhzg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.880379 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px7s8\" (UniqueName: \"kubernetes.io/projected/d0bc7919-add5-46fe-ab1b-26b7b3e114de-kube-api-access-px7s8\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.880421 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0bc7919-add5-46fe-ab1b-26b7b3e114de-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.880435 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhzg4\" (UniqueName: \"kubernetes.io/projected/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-kube-api-access-bhzg4\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.880446 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.880458 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3695d7a-eba8-4780-9995-e47e5989da34-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.880469 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q4lw\" (UniqueName: \"kubernetes.io/projected/e3695d7a-eba8-4780-9995-e47e5989da34-kube-api-access-6q4lw\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.902335 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-15ca-account-create-4jszs" event={"ID":"7209d158-55d3-457e-a685-83d7a82fb290","Type":"ContainerDied","Data":"32bb606f256969aba0fd627205cd3834d48cc2d1e3eeed7db07a90d1443d8f28"} Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.902383 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32bb606f256969aba0fd627205cd3834d48cc2d1e3eeed7db07a90d1443d8f28" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.902448 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-15ca-account-create-4jszs" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.905144 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4809-account-create-tsnms" event={"ID":"e3695d7a-eba8-4780-9995-e47e5989da34","Type":"ContainerDied","Data":"be32a8f9c3549a82f151f5f3ff57fdb21ffb698d745b13a7e7283700bc8e8bf8"} Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.905382 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be32a8f9c3549a82f151f5f3ff57fdb21ffb698d745b13a7e7283700bc8e8bf8" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.905479 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4809-account-create-tsnms" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.924264 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-scf94" event={"ID":"5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5","Type":"ContainerDied","Data":"ef61d94632649905a8553410691b5170bd17f985e91f8c23c2ab5c5ca1714738"} Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.924643 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-scf94" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.924652 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef61d94632649905a8553410691b5170bd17f985e91f8c23c2ab5c5ca1714738" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.931524 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c9qdj" event={"ID":"d0bc7919-add5-46fe-ab1b-26b7b3e114de","Type":"ContainerDied","Data":"c0522687a9f957fcf329ec68e84a1ebc37525c87b17ded9675cbfaee49a224f3"} Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.931557 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c9qdj" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.931563 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0522687a9f957fcf329ec68e84a1ebc37525c87b17ded9675cbfaee49a224f3" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.934467 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dfac-account-create-jx6r6" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.934690 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dfac-account-create-jx6r6" event={"ID":"eddb06eb-36f7-48ba-acbc-b2129ca2b43d","Type":"ContainerDied","Data":"27473ec38235649f1dfddae281ef783b29503b9a2d20611d2d3eb6ba8d069f61"} Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.934714 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27473ec38235649f1dfddae281ef783b29503b9a2d20611d2d3eb6ba8d069f61" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.938232 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l2vdd" event={"ID":"70ebb80d-1c77-4582-a075-78376fe2c7dd","Type":"ContainerDied","Data":"3c82038693491f420c0d7a34a209d9c7fec908a92cee60618381a168ca9c0b17"} Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.938266 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c82038693491f420c0d7a34a209d9c7fec908a92cee60618381a168ca9c0b17" Nov 22 09:30:39 crc kubenswrapper[4846]: I1122 09:30:39.938329 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-l2vdd" Nov 22 09:30:42 crc kubenswrapper[4846]: I1122 09:30:42.970929 4846 generic.go:334] "Generic (PLEG): container finished" podID="899cf49d-9541-4f23-b1a2-887324973fb1" containerID="56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37" exitCode=0 Nov 22 09:30:42 crc kubenswrapper[4846]: I1122 09:30:42.971058 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"899cf49d-9541-4f23-b1a2-887324973fb1","Type":"ContainerDied","Data":"56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37"} Nov 22 09:30:42 crc kubenswrapper[4846]: I1122 09:30:42.973284 4846 generic.go:334] "Generic (PLEG): container finished" podID="98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" containerID="fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7" exitCode=0 Nov 22 09:30:42 crc kubenswrapper[4846]: I1122 09:30:42.973318 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6","Type":"ContainerDied","Data":"fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7"} Nov 22 09:30:43 crc kubenswrapper[4846]: I1122 09:30:43.148429 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:43 crc kubenswrapper[4846]: E1122 09:30:43.148769 4846 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 09:30:43 crc kubenswrapper[4846]: E1122 09:30:43.148941 4846 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 09:30:43 crc kubenswrapper[4846]: E1122 09:30:43.149002 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift podName:575c6d2b-ae18-48ec-a314-211ccd078d87 nodeName:}" failed. No retries permitted until 2025-11-22 09:30:59.148981018 +0000 UTC m=+1034.084670667 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift") pod "swift-storage-0" (UID: "575c6d2b-ae18-48ec-a314-211ccd078d87") : configmap "swift-ring-files" not found Nov 22 09:30:43 crc kubenswrapper[4846]: I1122 09:30:43.983693 4846 generic.go:334] "Generic (PLEG): container finished" podID="6f537097-bfac-4915-833f-ee9a52e7d8a5" containerID="d4386ba9097111d54bec85a5d066c56cf3f28fc16f42d09aa3eb80e5f90be442" exitCode=0 Nov 22 09:30:43 crc kubenswrapper[4846]: I1122 09:30:43.983775 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hgdvt" event={"ID":"6f537097-bfac-4915-833f-ee9a52e7d8a5","Type":"ContainerDied","Data":"d4386ba9097111d54bec85a5d066c56cf3f28fc16f42d09aa3eb80e5f90be442"} Nov 22 09:30:43 crc kubenswrapper[4846]: I1122 09:30:43.986640 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"899cf49d-9541-4f23-b1a2-887324973fb1","Type":"ContainerStarted","Data":"81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad"} Nov 22 09:30:43 crc kubenswrapper[4846]: I1122 09:30:43.986862 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 22 09:30:43 crc kubenswrapper[4846]: I1122 09:30:43.989969 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6","Type":"ContainerStarted","Data":"1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d"} Nov 22 09:30:43 crc kubenswrapper[4846]: I1122 09:30:43.990344 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.027812 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=45.551828421 podStartE2EDuration="55.027790264s" podCreationTimestamp="2025-11-22 09:29:49 +0000 UTC" firstStartedPulling="2025-11-22 09:29:57.051413264 +0000 UTC m=+971.987102923" lastFinishedPulling="2025-11-22 09:30:06.527375117 +0000 UTC m=+981.463064766" observedRunningTime="2025-11-22 09:30:44.026062864 +0000 UTC m=+1018.961752513" watchObservedRunningTime="2025-11-22 09:30:44.027790264 +0000 UTC m=+1018.963479913" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.054427 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.620193285 podStartE2EDuration="55.054400154s" podCreationTimestamp="2025-11-22 09:29:49 +0000 UTC" firstStartedPulling="2025-11-22 09:29:57.053897317 +0000 UTC m=+971.989586956" lastFinishedPulling="2025-11-22 09:30:06.488104176 +0000 UTC m=+981.423793825" observedRunningTime="2025-11-22 09:30:44.047660836 +0000 UTC m=+1018.983350485" watchObservedRunningTime="2025-11-22 09:30:44.054400154 +0000 UTC m=+1018.990089803" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.419703 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-sdgn9"] Nov 22 09:30:44 crc kubenswrapper[4846]: E1122 09:30:44.420437 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ebb80d-1c77-4582-a075-78376fe2c7dd" containerName="mariadb-database-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420452 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ebb80d-1c77-4582-a075-78376fe2c7dd" 
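containerName="mariadb-database-create"

The nestedpendingoperations.go:348 entry above shows kubelet's per-volume exponential backoff: the etc-swift projected volume for swift-storage-0 cannot be built because the swift-ring-files configmap does not exist yet (it appears to be produced by the swift-ring-rebalance job still running at this point), so the mount is parked for 16s instead of being retried hot. Assuming the usual kubelet backoff defaults of 500ms initial delay, doubling per failure, capped at 2m2s (an assumption about the constants, not something visible in the log), 16s corresponds to the sixth consecutive failure:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed defaults from kubelet's exponential backoff for volume
        // operations: 500ms initial delay, doubled per failure, capped at 2m2s.
        delay, limit := 500*time.Millisecond, 2*time.Minute+2*time.Second
        for failures := 1; failures <= 9; failures++ {
            fmt.Printf("failure %d -> durationBeforeRetry %v\n", failures, delay)
            if delay *= 2; delay > limit {
                delay = limit
            }
        }
    }

The retry lands on schedule at 09:30:59 further down, once the ring files exist.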
containerName="mariadb-database-create" Nov 22 09:30:44 crc kubenswrapper[4846]: E1122 09:30:44.420469 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac74ecfd-8981-4682-847e-b8c23742bfd0" containerName="init" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420475 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac74ecfd-8981-4682-847e-b8c23742bfd0" containerName="init" Nov 22 09:30:44 crc kubenswrapper[4846]: E1122 09:30:44.420489 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3695d7a-eba8-4780-9995-e47e5989da34" containerName="mariadb-account-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420495 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3695d7a-eba8-4780-9995-e47e5989da34" containerName="mariadb-account-create" Nov 22 09:30:44 crc kubenswrapper[4846]: E1122 09:30:44.420507 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5" containerName="mariadb-database-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420514 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5" containerName="mariadb-database-create" Nov 22 09:30:44 crc kubenswrapper[4846]: E1122 09:30:44.420524 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bc7919-add5-46fe-ab1b-26b7b3e114de" containerName="mariadb-database-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420532 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bc7919-add5-46fe-ab1b-26b7b3e114de" containerName="mariadb-database-create" Nov 22 09:30:44 crc kubenswrapper[4846]: E1122 09:30:44.420548 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7209d158-55d3-457e-a685-83d7a82fb290" containerName="mariadb-account-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420563 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="7209d158-55d3-457e-a685-83d7a82fb290" containerName="mariadb-account-create" Nov 22 09:30:44 crc kubenswrapper[4846]: E1122 09:30:44.420575 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eddb06eb-36f7-48ba-acbc-b2129ca2b43d" containerName="mariadb-account-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420581 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="eddb06eb-36f7-48ba-acbc-b2129ca2b43d" containerName="mariadb-account-create" Nov 22 09:30:44 crc kubenswrapper[4846]: E1122 09:30:44.420597 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac74ecfd-8981-4682-847e-b8c23742bfd0" containerName="dnsmasq-dns" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420602 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac74ecfd-8981-4682-847e-b8c23742bfd0" containerName="dnsmasq-dns" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420786 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac74ecfd-8981-4682-847e-b8c23742bfd0" containerName="dnsmasq-dns" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420810 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bc7919-add5-46fe-ab1b-26b7b3e114de" containerName="mariadb-database-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420828 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3695d7a-eba8-4780-9995-e47e5989da34" containerName="mariadb-account-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420836 4846 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5" containerName="mariadb-database-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420852 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="7209d158-55d3-457e-a685-83d7a82fb290" containerName="mariadb-account-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420861 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="eddb06eb-36f7-48ba-acbc-b2129ca2b43d" containerName="mariadb-account-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.420884 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ebb80d-1c77-4582-a075-78376fe2c7dd" containerName="mariadb-database-create" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.421488 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.423939 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.424158 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vfpx8" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.436616 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sdgn9"] Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.578222 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-db-sync-config-data\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.578491 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjg4p\" (UniqueName: \"kubernetes.io/projected/bf68f3c4-7d31-4738-8a2f-97e24d184a29-kube-api-access-gjg4p\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.579111 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-combined-ca-bundle\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.579203 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-config-data\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.681199 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-db-sync-config-data\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.681312 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gjg4p\" (UniqueName: \"kubernetes.io/projected/bf68f3c4-7d31-4738-8a2f-97e24d184a29-kube-api-access-gjg4p\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.681417 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-combined-ca-bundle\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.681446 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-config-data\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.687991 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-db-sync-config-data\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.688740 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-combined-ca-bundle\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.700863 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-config-data\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.711493 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjg4p\" (UniqueName: \"kubernetes.io/projected/bf68f3c4-7d31-4738-8a2f-97e24d184a29-kube-api-access-gjg4p\") pod \"glance-db-sync-sdgn9\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:44 crc kubenswrapper[4846]: I1122 09:30:44.769910 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sdgn9" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.220340 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sdgn9"] Nov 22 09:30:45 crc kubenswrapper[4846]: W1122 09:30:45.222318 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf68f3c4_7d31_4738_8a2f_97e24d184a29.slice/crio-2fc55a33d0a5f6fb8772142ea96e102119ed847f98328b3f2e9bc1fc9de0d18d WatchSource:0}: Error finding container 2fc55a33d0a5f6fb8772142ea96e102119ed847f98328b3f2e9bc1fc9de0d18d: Status 404 returned error can't find the container with id 2fc55a33d0a5f6fb8772142ea96e102119ed847f98328b3f2e9bc1fc9de0d18d Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.330496 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.499312 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-dispersionconf\") pod \"6f537097-bfac-4915-833f-ee9a52e7d8a5\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.499364 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-scripts\") pod \"6f537097-bfac-4915-833f-ee9a52e7d8a5\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.499437 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t8xf\" (UniqueName: \"kubernetes.io/projected/6f537097-bfac-4915-833f-ee9a52e7d8a5-kube-api-access-4t8xf\") pod \"6f537097-bfac-4915-833f-ee9a52e7d8a5\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.499702 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-combined-ca-bundle\") pod \"6f537097-bfac-4915-833f-ee9a52e7d8a5\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.499742 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f537097-bfac-4915-833f-ee9a52e7d8a5-etc-swift\") pod \"6f537097-bfac-4915-833f-ee9a52e7d8a5\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.499816 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-ring-data-devices\") pod \"6f537097-bfac-4915-833f-ee9a52e7d8a5\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.499841 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-swiftconf\") pod \"6f537097-bfac-4915-833f-ee9a52e7d8a5\" (UID: \"6f537097-bfac-4915-833f-ee9a52e7d8a5\") " Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.500521 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6f537097-bfac-4915-833f-ee9a52e7d8a5" (UID: "6f537097-bfac-4915-833f-ee9a52e7d8a5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.500876 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f537097-bfac-4915-833f-ee9a52e7d8a5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6f537097-bfac-4915-833f-ee9a52e7d8a5" (UID: "6f537097-bfac-4915-833f-ee9a52e7d8a5"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.506945 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f537097-bfac-4915-833f-ee9a52e7d8a5-kube-api-access-4t8xf" (OuterVolumeSpecName: "kube-api-access-4t8xf") pod "6f537097-bfac-4915-833f-ee9a52e7d8a5" (UID: "6f537097-bfac-4915-833f-ee9a52e7d8a5"). InnerVolumeSpecName "kube-api-access-4t8xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.510467 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6f537097-bfac-4915-833f-ee9a52e7d8a5" (UID: "6f537097-bfac-4915-833f-ee9a52e7d8a5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.522781 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-scripts" (OuterVolumeSpecName: "scripts") pod "6f537097-bfac-4915-833f-ee9a52e7d8a5" (UID: "6f537097-bfac-4915-833f-ee9a52e7d8a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.530934 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f537097-bfac-4915-833f-ee9a52e7d8a5" (UID: "6f537097-bfac-4915-833f-ee9a52e7d8a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.533587 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6f537097-bfac-4915-833f-ee9a52e7d8a5" (UID: "6f537097-bfac-4915-833f-ee9a52e7d8a5"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.602236 4846 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f537097-bfac-4915-833f-ee9a52e7d8a5-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.602275 4846 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.602286 4846 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.602296 4846 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.602311 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f537097-bfac-4915-833f-ee9a52e7d8a5-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.602321 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t8xf\" (UniqueName: \"kubernetes.io/projected/6f537097-bfac-4915-833f-ee9a52e7d8a5-kube-api-access-4t8xf\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:45 crc kubenswrapper[4846]: I1122 09:30:45.602329 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f537097-bfac-4915-833f-ee9a52e7d8a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:46 crc kubenswrapper[4846]: I1122 09:30:46.010760 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sdgn9" event={"ID":"bf68f3c4-7d31-4738-8a2f-97e24d184a29","Type":"ContainerStarted","Data":"2fc55a33d0a5f6fb8772142ea96e102119ed847f98328b3f2e9bc1fc9de0d18d"} Nov 22 09:30:46 crc kubenswrapper[4846]: I1122 09:30:46.013904 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hgdvt" event={"ID":"6f537097-bfac-4915-833f-ee9a52e7d8a5","Type":"ContainerDied","Data":"fdec4581f9d37e21ffaca68011a2be1ca7dfed132f1f7ae23acbe75ed2d2a60b"} Nov 22 09:30:46 crc kubenswrapper[4846]: I1122 09:30:46.014143 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdec4581f9d37e21ffaca68011a2be1ca7dfed132f1f7ae23acbe75ed2d2a60b" Nov 22 09:30:46 crc kubenswrapper[4846]: I1122 09:30:46.014149 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hgdvt" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.112750 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-576fl" podUID="65c370a7-5d69-437a-98d2-810e97b9a5b7" containerName="ovn-controller" probeResult="failure" output=< Nov 22 09:30:49 crc kubenswrapper[4846]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 22 09:30:49 crc kubenswrapper[4846]: > Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.144348 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.163714 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bdxdm" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.400652 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-576fl-config-d94mg"] Nov 22 09:30:49 crc kubenswrapper[4846]: E1122 09:30:49.401562 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f537097-bfac-4915-833f-ee9a52e7d8a5" containerName="swift-ring-rebalance" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.401661 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f537097-bfac-4915-833f-ee9a52e7d8a5" containerName="swift-ring-rebalance" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.401933 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f537097-bfac-4915-833f-ee9a52e7d8a5" containerName="swift-ring-rebalance" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.402706 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.405502 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.419355 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-576fl-config-d94mg"] Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.482493 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run-ovn\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.482966 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.482993 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-scripts\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.483036 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-s8btn\" (UniqueName: \"kubernetes.io/projected/de6ce110-038d-47df-a2d3-8bc2b86bb8be-kube-api-access-s8btn\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.483146 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-log-ovn\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.483192 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-additional-scripts\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.585472 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-log-ovn\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.585548 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-additional-scripts\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.585583 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run-ovn\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.585603 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.585621 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-scripts\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.585655 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8btn\" (UniqueName: \"kubernetes.io/projected/de6ce110-038d-47df-a2d3-8bc2b86bb8be-kube-api-access-s8btn\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.586024 
4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run-ovn\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.586102 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.586792 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-additional-scripts\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.591327 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-scripts\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.591590 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-log-ovn\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.625380 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8btn\" (UniqueName: \"kubernetes.io/projected/de6ce110-038d-47df-a2d3-8bc2b86bb8be-kube-api-access-s8btn\") pod \"ovn-controller-576fl-config-d94mg\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:49 crc kubenswrapper[4846]: I1122 09:30:49.722599 4846 util.go:30] "No sandbox for pod can be found. 
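Need to start a new one" pod="openstack/ovn-controller-576fl-config-d94mg"

The ovn-controller-576fl-config-d94mg job above mixes configmap and projected volumes with three host-path volumes (var-run, var-run-ovn, var-log-ovn). For host-path there is nothing to materialize per pod; SetUp amounts to validating the node path, which is why those MountVolume.SetUp entries succeed within microseconds of the mount starting. A trivial check one could run on the node; the concrete /var/run and /var/log paths are my assumption for what these volume names map to, since the journal shows only the volume names, not their hostPath targets:

    package main

    import (
        "fmt"
        "os"
    )

    // Run on the node. Paths are assumed targets of the var-run,
    // var-run-ovn and var-log-ovn hostPath volumes seen in the log.
    func main() {
        for _, p := range []string{"/var/run", "/var/run/ovn", "/var/log/ovn"} {
            fi, err := os.Stat(p)
            if err != nil {
                fmt.Println(p, "->", err)
                continue
            }
            fmt.Println(p, "->", fi.Mode())
        }
    }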
Need to start a new one" pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:50 crc kubenswrapper[4846]: I1122 09:30:50.058040 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-576fl-config-d94mg"] Nov 22 09:30:50 crc kubenswrapper[4846]: W1122 09:30:50.061493 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde6ce110_038d_47df_a2d3_8bc2b86bb8be.slice/crio-66628eee892e870f156825df54e79d869b27f3eac2f297d0fb798181438bfbc2 WatchSource:0}: Error finding container 66628eee892e870f156825df54e79d869b27f3eac2f297d0fb798181438bfbc2: Status 404 returned error can't find the container with id 66628eee892e870f156825df54e79d869b27f3eac2f297d0fb798181438bfbc2 Nov 22 09:30:51 crc kubenswrapper[4846]: I1122 09:30:51.077181 4846 generic.go:334] "Generic (PLEG): container finished" podID="de6ce110-038d-47df-a2d3-8bc2b86bb8be" containerID="a6b00086499f1c5f5e7ccc2f8cac2b93885368d78544599d9c6f1a835fe131fc" exitCode=0 Nov 22 09:30:51 crc kubenswrapper[4846]: I1122 09:30:51.077391 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-576fl-config-d94mg" event={"ID":"de6ce110-038d-47df-a2d3-8bc2b86bb8be","Type":"ContainerDied","Data":"a6b00086499f1c5f5e7ccc2f8cac2b93885368d78544599d9c6f1a835fe131fc"} Nov 22 09:30:51 crc kubenswrapper[4846]: I1122 09:30:51.077736 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-576fl-config-d94mg" event={"ID":"de6ce110-038d-47df-a2d3-8bc2b86bb8be","Type":"ContainerStarted","Data":"66628eee892e870f156825df54e79d869b27f3eac2f297d0fb798181438bfbc2"} Nov 22 09:30:54 crc kubenswrapper[4846]: I1122 09:30:54.111256 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-576fl" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.710004 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.885696 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run-ovn\") pod \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.885772 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-additional-scripts\") pod \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.885809 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-log-ovn\") pod \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.885825 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "de6ce110-038d-47df-a2d3-8bc2b86bb8be" (UID: "de6ce110-038d-47df-a2d3-8bc2b86bb8be"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.885855 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8btn\" (UniqueName: \"kubernetes.io/projected/de6ce110-038d-47df-a2d3-8bc2b86bb8be-kube-api-access-s8btn\") pod \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.885987 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run\") pod \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.886017 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "de6ce110-038d-47df-a2d3-8bc2b86bb8be" (UID: "de6ce110-038d-47df-a2d3-8bc2b86bb8be"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.886105 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run" (OuterVolumeSpecName: "var-run") pod "de6ce110-038d-47df-a2d3-8bc2b86bb8be" (UID: "de6ce110-038d-47df-a2d3-8bc2b86bb8be"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.886177 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-scripts\") pod \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\" (UID: \"de6ce110-038d-47df-a2d3-8bc2b86bb8be\") " Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.887211 4846 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.887235 4846 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.887249 4846 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de6ce110-038d-47df-a2d3-8bc2b86bb8be-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.887335 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "de6ce110-038d-47df-a2d3-8bc2b86bb8be" (UID: "de6ce110-038d-47df-a2d3-8bc2b86bb8be"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.887600 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-scripts" (OuterVolumeSpecName: "scripts") pod "de6ce110-038d-47df-a2d3-8bc2b86bb8be" (UID: "de6ce110-038d-47df-a2d3-8bc2b86bb8be"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.892787 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6ce110-038d-47df-a2d3-8bc2b86bb8be-kube-api-access-s8btn" (OuterVolumeSpecName: "kube-api-access-s8btn") pod "de6ce110-038d-47df-a2d3-8bc2b86bb8be" (UID: "de6ce110-038d-47df-a2d3-8bc2b86bb8be"). InnerVolumeSpecName "kube-api-access-s8btn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.989496 4846 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.989532 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8btn\" (UniqueName: \"kubernetes.io/projected/de6ce110-038d-47df-a2d3-8bc2b86bb8be-kube-api-access-s8btn\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:58 crc kubenswrapper[4846]: I1122 09:30:58.989544 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de6ce110-038d-47df-a2d3-8bc2b86bb8be-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:59 crc kubenswrapper[4846]: I1122 09:30:59.162240 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-576fl-config-d94mg" event={"ID":"de6ce110-038d-47df-a2d3-8bc2b86bb8be","Type":"ContainerDied","Data":"66628eee892e870f156825df54e79d869b27f3eac2f297d0fb798181438bfbc2"} Nov 22 09:30:59 crc kubenswrapper[4846]: I1122 09:30:59.162313 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66628eee892e870f156825df54e79d869b27f3eac2f297d0fb798181438bfbc2" Nov 22 09:30:59 crc kubenswrapper[4846]: I1122 09:30:59.162945 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-576fl-config-d94mg" Nov 22 09:30:59 crc kubenswrapper[4846]: I1122 09:30:59.193405 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:59 crc kubenswrapper[4846]: I1122 09:30:59.202071 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/575c6d2b-ae18-48ec-a314-211ccd078d87-etc-swift\") pod \"swift-storage-0\" (UID: \"575c6d2b-ae18-48ec-a314-211ccd078d87\") " pod="openstack/swift-storage-0" Nov 22 09:30:59 crc kubenswrapper[4846]: I1122 09:30:59.407069 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 22 09:30:59 crc kubenswrapper[4846]: I1122 09:30:59.825864 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-576fl-config-d94mg"] Nov 22 09:30:59 crc kubenswrapper[4846]: I1122 09:30:59.832303 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-576fl-config-d94mg"] Nov 22 09:31:00 crc kubenswrapper[4846]: I1122 09:31:00.030768 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 22 09:31:00 crc kubenswrapper[4846]: W1122 09:31:00.035516 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod575c6d2b_ae18_48ec_a314_211ccd078d87.slice/crio-953f88a664cc1668d06c641be1a7751ed509a60af47e9abaa56cf30546a96a65 WatchSource:0}: Error finding container 953f88a664cc1668d06c641be1a7751ed509a60af47e9abaa56cf30546a96a65: Status 404 returned error can't find the container with id 953f88a664cc1668d06c641be1a7751ed509a60af47e9abaa56cf30546a96a65 Nov 22 09:31:00 crc kubenswrapper[4846]: I1122 09:31:00.052929 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6ce110-038d-47df-a2d3-8bc2b86bb8be" path="/var/lib/kubelet/pods/de6ce110-038d-47df-a2d3-8bc2b86bb8be/volumes" Nov 22 09:31:00 crc kubenswrapper[4846]: I1122 09:31:00.172320 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sdgn9" event={"ID":"bf68f3c4-7d31-4738-8a2f-97e24d184a29","Type":"ContainerStarted","Data":"61733e54616f04475ee736b0b982cbe49c948a4518fb2dd424d1dae3f6529ba3"} Nov 22 09:31:00 crc kubenswrapper[4846]: I1122 09:31:00.174716 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"953f88a664cc1668d06c641be1a7751ed509a60af47e9abaa56cf30546a96a65"} Nov 22 09:31:00 crc kubenswrapper[4846]: I1122 09:31:00.197848 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-sdgn9" podStartSLOduration=2.709426627 podStartE2EDuration="16.197827942s" podCreationTimestamp="2025-11-22 09:30:44 +0000 UTC" firstStartedPulling="2025-11-22 09:30:45.226573955 +0000 UTC m=+1020.162263604" lastFinishedPulling="2025-11-22 09:30:58.71497526 +0000 UTC m=+1033.650664919" observedRunningTime="2025-11-22 09:31:00.19127513 +0000 UTC m=+1035.126964789" watchObservedRunningTime="2025-11-22 09:31:00.197827942 +0000 UTC m=+1035.133517591" Nov 22 09:31:00 crc kubenswrapper[4846]: I1122 09:31:00.720326 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.071008 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-t2mkn"] Nov 22 09:31:01 crc kubenswrapper[4846]: E1122 09:31:01.071489 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6ce110-038d-47df-a2d3-8bc2b86bb8be" containerName="ovn-config" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.071506 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6ce110-038d-47df-a2d3-8bc2b86bb8be" containerName="ovn-config" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.071716 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6ce110-038d-47df-a2d3-8bc2b86bb8be" containerName="ovn-config" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.075524 4846 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.103596 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-t2mkn"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.142330 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.190770 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8zbc6"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.195991 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.215313 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8zbc6"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.268336 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ed51-account-create-rvllg"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.270433 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.271396 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz2ns\" (UniqueName: \"kubernetes.io/projected/1e293562-6940-4e74-90c4-a57ba16599ef-kube-api-access-fz2ns\") pod \"cinder-db-create-t2mkn\" (UID: \"1e293562-6940-4e74-90c4-a57ba16599ef\") " pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.271636 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e293562-6940-4e74-90c4-a57ba16599ef-operator-scripts\") pod \"cinder-db-create-t2mkn\" (UID: \"1e293562-6940-4e74-90c4-a57ba16599ef\") " pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.273991 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.276644 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ed51-account-create-rvllg"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.375698 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792e14dc-fdaa-4ea2-a71d-6bef55b41871-operator-scripts\") pod \"barbican-db-create-8zbc6\" (UID: \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\") " pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.376350 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e293562-6940-4e74-90c4-a57ba16599ef-operator-scripts\") pod \"cinder-db-create-t2mkn\" (UID: \"1e293562-6940-4e74-90c4-a57ba16599ef\") " pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.376413 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz2ns\" (UniqueName: \"kubernetes.io/projected/1e293562-6940-4e74-90c4-a57ba16599ef-kube-api-access-fz2ns\") pod \"cinder-db-create-t2mkn\" (UID: 
\"1e293562-6940-4e74-90c4-a57ba16599ef\") " pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.376449 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413f1d30-1d47-47b2-a954-91b8ed0134f3-operator-scripts\") pod \"barbican-ed51-account-create-rvllg\" (UID: \"413f1d30-1d47-47b2-a954-91b8ed0134f3\") " pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.376472 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfff5\" (UniqueName: \"kubernetes.io/projected/413f1d30-1d47-47b2-a954-91b8ed0134f3-kube-api-access-xfff5\") pod \"barbican-ed51-account-create-rvllg\" (UID: \"413f1d30-1d47-47b2-a954-91b8ed0134f3\") " pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.376531 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vw27\" (UniqueName: \"kubernetes.io/projected/792e14dc-fdaa-4ea2-a71d-6bef55b41871-kube-api-access-2vw27\") pod \"barbican-db-create-8zbc6\" (UID: \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\") " pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.379958 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e293562-6940-4e74-90c4-a57ba16599ef-operator-scripts\") pod \"cinder-db-create-t2mkn\" (UID: \"1e293562-6940-4e74-90c4-a57ba16599ef\") " pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.387356 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fa87-account-create-nn2fj"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.388724 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.393426 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.406740 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fa87-account-create-nn2fj"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.421880 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz2ns\" (UniqueName: \"kubernetes.io/projected/1e293562-6940-4e74-90c4-a57ba16599ef-kube-api-access-fz2ns\") pod \"cinder-db-create-t2mkn\" (UID: \"1e293562-6940-4e74-90c4-a57ba16599ef\") " pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.449207 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.481096 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mdpsq"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.482385 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.498920 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vw27\" (UniqueName: \"kubernetes.io/projected/792e14dc-fdaa-4ea2-a71d-6bef55b41871-kube-api-access-2vw27\") pod \"barbican-db-create-8zbc6\" (UID: \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\") " pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.498967 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792e14dc-fdaa-4ea2-a71d-6bef55b41871-operator-scripts\") pod \"barbican-db-create-8zbc6\" (UID: \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\") " pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.499063 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp786\" (UniqueName: \"kubernetes.io/projected/ba0e4040-2d4f-423f-8540-368a4c49bd74-kube-api-access-mp786\") pod \"cinder-fa87-account-create-nn2fj\" (UID: \"ba0e4040-2d4f-423f-8540-368a4c49bd74\") " pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.499104 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cb86a-7325-4151-9b4c-b8af3060b82a-operator-scripts\") pod \"neutron-db-create-mdpsq\" (UID: \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\") " pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.499129 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413f1d30-1d47-47b2-a954-91b8ed0134f3-operator-scripts\") pod \"barbican-ed51-account-create-rvllg\" (UID: \"413f1d30-1d47-47b2-a954-91b8ed0134f3\") " pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.499150 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfff5\" (UniqueName: \"kubernetes.io/projected/413f1d30-1d47-47b2-a954-91b8ed0134f3-kube-api-access-xfff5\") pod \"barbican-ed51-account-create-rvllg\" (UID: \"413f1d30-1d47-47b2-a954-91b8ed0134f3\") " pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.499190 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba0e4040-2d4f-423f-8540-368a4c49bd74-operator-scripts\") pod \"cinder-fa87-account-create-nn2fj\" (UID: \"ba0e4040-2d4f-423f-8540-368a4c49bd74\") " pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.499205 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95ml\" (UniqueName: \"kubernetes.io/projected/2b2cb86a-7325-4151-9b4c-b8af3060b82a-kube-api-access-w95ml\") pod \"neutron-db-create-mdpsq\" (UID: \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\") " pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.501380 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/792e14dc-fdaa-4ea2-a71d-6bef55b41871-operator-scripts\") pod \"barbican-db-create-8zbc6\" (UID: \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\") " pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.501949 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413f1d30-1d47-47b2-a954-91b8ed0134f3-operator-scripts\") pod \"barbican-ed51-account-create-rvllg\" (UID: \"413f1d30-1d47-47b2-a954-91b8ed0134f3\") " pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.526404 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4nttj"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.528950 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfff5\" (UniqueName: \"kubernetes.io/projected/413f1d30-1d47-47b2-a954-91b8ed0134f3-kube-api-access-xfff5\") pod \"barbican-ed51-account-create-rvllg\" (UID: \"413f1d30-1d47-47b2-a954-91b8ed0134f3\") " pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.532812 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.556342 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vw27\" (UniqueName: \"kubernetes.io/projected/792e14dc-fdaa-4ea2-a71d-6bef55b41871-kube-api-access-2vw27\") pod \"barbican-db-create-8zbc6\" (UID: \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\") " pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.556444 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.556610 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d54gt" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.556722 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.556871 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.558225 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mdpsq"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.586292 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4nttj"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.600629 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba0e4040-2d4f-423f-8540-368a4c49bd74-operator-scripts\") pod \"cinder-fa87-account-create-nn2fj\" (UID: \"ba0e4040-2d4f-423f-8540-368a4c49bd74\") " pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.600683 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95ml\" (UniqueName: \"kubernetes.io/projected/2b2cb86a-7325-4151-9b4c-b8af3060b82a-kube-api-access-w95ml\") pod \"neutron-db-create-mdpsq\" (UID: \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\") " pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 
09:31:01.600781 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkx8j\" (UniqueName: \"kubernetes.io/projected/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-kube-api-access-nkx8j\") pod \"keystone-db-sync-4nttj\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.600805 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp786\" (UniqueName: \"kubernetes.io/projected/ba0e4040-2d4f-423f-8540-368a4c49bd74-kube-api-access-mp786\") pod \"cinder-fa87-account-create-nn2fj\" (UID: \"ba0e4040-2d4f-423f-8540-368a4c49bd74\") " pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.600835 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-config-data\") pod \"keystone-db-sync-4nttj\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.600853 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-combined-ca-bundle\") pod \"keystone-db-sync-4nttj\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.600878 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cb86a-7325-4151-9b4c-b8af3060b82a-operator-scripts\") pod \"neutron-db-create-mdpsq\" (UID: \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\") " pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.602723 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba0e4040-2d4f-423f-8540-368a4c49bd74-operator-scripts\") pod \"cinder-fa87-account-create-nn2fj\" (UID: \"ba0e4040-2d4f-423f-8540-368a4c49bd74\") " pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.605738 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cb86a-7325-4151-9b4c-b8af3060b82a-operator-scripts\") pod \"neutron-db-create-mdpsq\" (UID: \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\") " pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.607034 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.625291 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp786\" (UniqueName: \"kubernetes.io/projected/ba0e4040-2d4f-423f-8540-368a4c49bd74-kube-api-access-mp786\") pod \"cinder-fa87-account-create-nn2fj\" (UID: \"ba0e4040-2d4f-423f-8540-368a4c49bd74\") " pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.628100 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0e57-account-create-sbb6n"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.632294 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.642970 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.655259 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95ml\" (UniqueName: \"kubernetes.io/projected/2b2cb86a-7325-4151-9b4c-b8af3060b82a-kube-api-access-w95ml\") pod \"neutron-db-create-mdpsq\" (UID: \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\") " pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.700567 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0e57-account-create-sbb6n"] Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.703132 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-operator-scripts\") pod \"neutron-0e57-account-create-sbb6n\" (UID: \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\") " pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.703184 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkx8j\" (UniqueName: \"kubernetes.io/projected/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-kube-api-access-nkx8j\") pod \"keystone-db-sync-4nttj\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.703225 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-config-data\") pod \"keystone-db-sync-4nttj\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.703244 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-combined-ca-bundle\") pod \"keystone-db-sync-4nttj\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.703286 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftpq8\" (UniqueName: \"kubernetes.io/projected/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-kube-api-access-ftpq8\") pod \"neutron-0e57-account-create-sbb6n\" (UID: \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\") " pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 
09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.708616 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-combined-ca-bundle\") pod \"keystone-db-sync-4nttj\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.711448 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-config-data\") pod \"keystone-db-sync-4nttj\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.714826 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.723424 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkx8j\" (UniqueName: \"kubernetes.io/projected/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-kube-api-access-nkx8j\") pod \"keystone-db-sync-4nttj\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.805442 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-operator-scripts\") pod \"neutron-0e57-account-create-sbb6n\" (UID: \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\") " pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.805594 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftpq8\" (UniqueName: \"kubernetes.io/projected/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-kube-api-access-ftpq8\") pod \"neutron-0e57-account-create-sbb6n\" (UID: \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\") " pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.806860 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-operator-scripts\") pod \"neutron-0e57-account-create-sbb6n\" (UID: \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\") " pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.825201 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftpq8\" (UniqueName: \"kubernetes.io/projected/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-kube-api-access-ftpq8\") pod \"neutron-0e57-account-create-sbb6n\" (UID: \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\") " pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.833214 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:01 crc kubenswrapper[4846]: I1122 09:31:01.916969 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:02 crc kubenswrapper[4846]: I1122 09:31:02.000777 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:02 crc kubenswrapper[4846]: I1122 09:31:02.015336 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 09:31:02 crc kubenswrapper[4846]: I1122 09:31:02.501240 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fa87-account-create-nn2fj"] Nov 22 09:31:02 crc kubenswrapper[4846]: I1122 09:31:02.514150 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8zbc6"] Nov 22 09:31:02 crc kubenswrapper[4846]: I1122 09:31:02.570100 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ed51-account-create-rvllg"] Nov 22 09:31:02 crc kubenswrapper[4846]: I1122 09:31:02.595185 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-t2mkn"] Nov 22 09:31:02 crc kubenswrapper[4846]: W1122 09:31:02.606106 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod413f1d30_1d47_47b2_a954_91b8ed0134f3.slice/crio-7fd40b5bab4b790cb761754c5d374eb5bbb5259e04fd72a4cec128c883327db2 WatchSource:0}: Error finding container 7fd40b5bab4b790cb761754c5d374eb5bbb5259e04fd72a4cec128c883327db2: Status 404 returned error can't find the container with id 7fd40b5bab4b790cb761754c5d374eb5bbb5259e04fd72a4cec128c883327db2 Nov 22 09:31:02 crc kubenswrapper[4846]: I1122 09:31:02.755223 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0e57-account-create-sbb6n"] Nov 22 09:31:02 crc kubenswrapper[4846]: I1122 09:31:02.788423 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mdpsq"] Nov 22 09:31:02 crc kubenswrapper[4846]: I1122 09:31:02.805451 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4nttj"] Nov 22 09:31:02 crc kubenswrapper[4846]: W1122 09:31:02.841364 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b2cb86a_7325_4151_9b4c_b8af3060b82a.slice/crio-d1aecfe517d7cb6c15fa4cb84293a50a6ab9a527e5092dd568b3ba70f1cf1155 WatchSource:0}: Error finding container d1aecfe517d7cb6c15fa4cb84293a50a6ab9a527e5092dd568b3ba70f1cf1155: Status 404 returned error can't find the container with id d1aecfe517d7cb6c15fa4cb84293a50a6ab9a527e5092dd568b3ba70f1cf1155 Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.223714 4846 generic.go:334] "Generic (PLEG): container finished" podID="ba0e4040-2d4f-423f-8540-368a4c49bd74" containerID="9a32b28148a5a55c01240a63a00a3a5bc2ac21f11a4d8adf94ebf5b5bd17d091" exitCode=0 Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.224289 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fa87-account-create-nn2fj" event={"ID":"ba0e4040-2d4f-423f-8540-368a4c49bd74","Type":"ContainerDied","Data":"9a32b28148a5a55c01240a63a00a3a5bc2ac21f11a4d8adf94ebf5b5bd17d091"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.224326 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fa87-account-create-nn2fj" event={"ID":"ba0e4040-2d4f-423f-8540-368a4c49bd74","Type":"ContainerStarted","Data":"8c2758c59c1334bde42188b0ec527f58105d8c3fc9d84adecc9d8d7b9692e239"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.228556 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nttj" 
event={"ID":"175f1421-f4ac-4bc9-b3f6-fa5860f556b4","Type":"ContainerStarted","Data":"fbeeab8dfd52821760f76608e32798299064703492a600dea047146e083b7a3b"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.234591 4846 generic.go:334] "Generic (PLEG): container finished" podID="1e293562-6940-4e74-90c4-a57ba16599ef" containerID="b8ce0bf5fdf5fb7df3baaeb39a7a5691b8bae7beaf3f7aba54c339d0dd4ac80f" exitCode=0 Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.234668 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-t2mkn" event={"ID":"1e293562-6940-4e74-90c4-a57ba16599ef","Type":"ContainerDied","Data":"b8ce0bf5fdf5fb7df3baaeb39a7a5691b8bae7beaf3f7aba54c339d0dd4ac80f"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.234696 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-t2mkn" event={"ID":"1e293562-6940-4e74-90c4-a57ba16599ef","Type":"ContainerStarted","Data":"1734730209e63e21ef632b1aedefbbd9ca8ba9556cc8bef8aab4b0f7053bc6a5"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.245141 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"d728a2563ea058782693f619d0fa7f7556e9e1dc6926c5c2d98f3e0991cca43e"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.245215 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"902545795ed45e2840f326d9f107e2ec79d5de0a23aeefe70dd25433f12ebe93"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.245233 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"06b8e43d4727ae24a01a2c8e6ddbc4c8418f0932a94990883815a4f673daed6a"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.261585 4846 generic.go:334] "Generic (PLEG): container finished" podID="792e14dc-fdaa-4ea2-a71d-6bef55b41871" containerID="ed4a0c24553ebc438a1494550f18b15903db25b9b2741f7c1a482648530cc12a" exitCode=0 Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.262085 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8zbc6" event={"ID":"792e14dc-fdaa-4ea2-a71d-6bef55b41871","Type":"ContainerDied","Data":"ed4a0c24553ebc438a1494550f18b15903db25b9b2741f7c1a482648530cc12a"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.262158 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8zbc6" event={"ID":"792e14dc-fdaa-4ea2-a71d-6bef55b41871","Type":"ContainerStarted","Data":"c7f6a46e42b7e7b3859fe68be2acb511779f233bd347ba60d0a06792d9036386"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.266175 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e57-account-create-sbb6n" event={"ID":"9ec8aed9-00f5-4a29-af65-f87ab06bdda5","Type":"ContainerStarted","Data":"7d6a1861e0eddf446083dd7189b30b4479b51e25f0f5860f3ebf71781945b3f7"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.266244 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e57-account-create-sbb6n" event={"ID":"9ec8aed9-00f5-4a29-af65-f87ab06bdda5","Type":"ContainerStarted","Data":"e5580be2d08e8507d32c93d3e985e617af18ed9dcc633f86f3a79f8f797675de"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.267601 4846 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-db-create-mdpsq" event={"ID":"2b2cb86a-7325-4151-9b4c-b8af3060b82a","Type":"ContainerStarted","Data":"97adf0ffd581501bf454dbee296281151870069ea39674c160a4af6904ce4ccf"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.267662 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mdpsq" event={"ID":"2b2cb86a-7325-4151-9b4c-b8af3060b82a","Type":"ContainerStarted","Data":"d1aecfe517d7cb6c15fa4cb84293a50a6ab9a527e5092dd568b3ba70f1cf1155"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.273933 4846 generic.go:334] "Generic (PLEG): container finished" podID="413f1d30-1d47-47b2-a954-91b8ed0134f3" containerID="04533a3855c349d95bfaf88aa0774798a6e464da7aaf4f31703417377346ff0a" exitCode=0 Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.273991 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ed51-account-create-rvllg" event={"ID":"413f1d30-1d47-47b2-a954-91b8ed0134f3","Type":"ContainerDied","Data":"04533a3855c349d95bfaf88aa0774798a6e464da7aaf4f31703417377346ff0a"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.274020 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ed51-account-create-rvllg" event={"ID":"413f1d30-1d47-47b2-a954-91b8ed0134f3","Type":"ContainerStarted","Data":"7fd40b5bab4b790cb761754c5d374eb5bbb5259e04fd72a4cec128c883327db2"} Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.322642 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-mdpsq" podStartSLOduration=2.322614599 podStartE2EDuration="2.322614599s" podCreationTimestamp="2025-11-22 09:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:31:03.320373853 +0000 UTC m=+1038.256063502" watchObservedRunningTime="2025-11-22 09:31:03.322614599 +0000 UTC m=+1038.258304248" Nov 22 09:31:03 crc kubenswrapper[4846]: I1122 09:31:03.340649 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-0e57-account-create-sbb6n" podStartSLOduration=2.340629597 podStartE2EDuration="2.340629597s" podCreationTimestamp="2025-11-22 09:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:31:03.334404914 +0000 UTC m=+1038.270094563" watchObservedRunningTime="2025-11-22 09:31:03.340629597 +0000 UTC m=+1038.276319246" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.290622 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"7ba5bbf9242c408acd8d31b17ffa4bdc9ba9f3118226e19cd36fd520900eac6b"} Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.294143 4846 generic.go:334] "Generic (PLEG): container finished" podID="9ec8aed9-00f5-4a29-af65-f87ab06bdda5" containerID="7d6a1861e0eddf446083dd7189b30b4479b51e25f0f5860f3ebf71781945b3f7" exitCode=0 Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.294233 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e57-account-create-sbb6n" event={"ID":"9ec8aed9-00f5-4a29-af65-f87ab06bdda5","Type":"ContainerDied","Data":"7d6a1861e0eddf446083dd7189b30b4479b51e25f0f5860f3ebf71781945b3f7"} Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.297767 4846 generic.go:334] "Generic (PLEG): container finished" 
podID="2b2cb86a-7325-4151-9b4c-b8af3060b82a" containerID="97adf0ffd581501bf454dbee296281151870069ea39674c160a4af6904ce4ccf" exitCode=0 Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.298089 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mdpsq" event={"ID":"2b2cb86a-7325-4151-9b4c-b8af3060b82a","Type":"ContainerDied","Data":"97adf0ffd581501bf454dbee296281151870069ea39674c160a4af6904ce4ccf"} Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.707362 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.789127 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e293562-6940-4e74-90c4-a57ba16599ef-operator-scripts\") pod \"1e293562-6940-4e74-90c4-a57ba16599ef\" (UID: \"1e293562-6940-4e74-90c4-a57ba16599ef\") " Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.789311 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz2ns\" (UniqueName: \"kubernetes.io/projected/1e293562-6940-4e74-90c4-a57ba16599ef-kube-api-access-fz2ns\") pod \"1e293562-6940-4e74-90c4-a57ba16599ef\" (UID: \"1e293562-6940-4e74-90c4-a57ba16599ef\") " Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.790027 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e293562-6940-4e74-90c4-a57ba16599ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e293562-6940-4e74-90c4-a57ba16599ef" (UID: "1e293562-6940-4e74-90c4-a57ba16599ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.790748 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e293562-6940-4e74-90c4-a57ba16599ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.806447 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e293562-6940-4e74-90c4-a57ba16599ef-kube-api-access-fz2ns" (OuterVolumeSpecName: "kube-api-access-fz2ns") pod "1e293562-6940-4e74-90c4-a57ba16599ef" (UID: "1e293562-6940-4e74-90c4-a57ba16599ef"). InnerVolumeSpecName "kube-api-access-fz2ns". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.866150 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.885694 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.889709 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.900118 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba0e4040-2d4f-423f-8540-368a4c49bd74-operator-scripts\") pod \"ba0e4040-2d4f-423f-8540-368a4c49bd74\" (UID: \"ba0e4040-2d4f-423f-8540-368a4c49bd74\") " Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.900198 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp786\" (UniqueName: \"kubernetes.io/projected/ba0e4040-2d4f-423f-8540-368a4c49bd74-kube-api-access-mp786\") pod \"ba0e4040-2d4f-423f-8540-368a4c49bd74\" (UID: \"ba0e4040-2d4f-423f-8540-368a4c49bd74\") " Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.900582 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz2ns\" (UniqueName: \"kubernetes.io/projected/1e293562-6940-4e74-90c4-a57ba16599ef-kube-api-access-fz2ns\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.901181 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba0e4040-2d4f-423f-8540-368a4c49bd74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba0e4040-2d4f-423f-8540-368a4c49bd74" (UID: "ba0e4040-2d4f-423f-8540-368a4c49bd74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:04 crc kubenswrapper[4846]: I1122 09:31:04.906967 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba0e4040-2d4f-423f-8540-368a4c49bd74-kube-api-access-mp786" (OuterVolumeSpecName: "kube-api-access-mp786") pod "ba0e4040-2d4f-423f-8540-368a4c49bd74" (UID: "ba0e4040-2d4f-423f-8540-368a4c49bd74"). InnerVolumeSpecName "kube-api-access-mp786". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.002711 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vw27\" (UniqueName: \"kubernetes.io/projected/792e14dc-fdaa-4ea2-a71d-6bef55b41871-kube-api-access-2vw27\") pod \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\" (UID: \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\") " Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.002797 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfff5\" (UniqueName: \"kubernetes.io/projected/413f1d30-1d47-47b2-a954-91b8ed0134f3-kube-api-access-xfff5\") pod \"413f1d30-1d47-47b2-a954-91b8ed0134f3\" (UID: \"413f1d30-1d47-47b2-a954-91b8ed0134f3\") " Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.003096 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413f1d30-1d47-47b2-a954-91b8ed0134f3-operator-scripts\") pod \"413f1d30-1d47-47b2-a954-91b8ed0134f3\" (UID: \"413f1d30-1d47-47b2-a954-91b8ed0134f3\") " Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.003146 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792e14dc-fdaa-4ea2-a71d-6bef55b41871-operator-scripts\") pod \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\" (UID: \"792e14dc-fdaa-4ea2-a71d-6bef55b41871\") " Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.003582 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba0e4040-2d4f-423f-8540-368a4c49bd74-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.003606 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp786\" (UniqueName: \"kubernetes.io/projected/ba0e4040-2d4f-423f-8540-368a4c49bd74-kube-api-access-mp786\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.004124 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/792e14dc-fdaa-4ea2-a71d-6bef55b41871-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "792e14dc-fdaa-4ea2-a71d-6bef55b41871" (UID: "792e14dc-fdaa-4ea2-a71d-6bef55b41871"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.007178 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/413f1d30-1d47-47b2-a954-91b8ed0134f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "413f1d30-1d47-47b2-a954-91b8ed0134f3" (UID: "413f1d30-1d47-47b2-a954-91b8ed0134f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.013147 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/413f1d30-1d47-47b2-a954-91b8ed0134f3-kube-api-access-xfff5" (OuterVolumeSpecName: "kube-api-access-xfff5") pod "413f1d30-1d47-47b2-a954-91b8ed0134f3" (UID: "413f1d30-1d47-47b2-a954-91b8ed0134f3"). InnerVolumeSpecName "kube-api-access-xfff5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.017174 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792e14dc-fdaa-4ea2-a71d-6bef55b41871-kube-api-access-2vw27" (OuterVolumeSpecName: "kube-api-access-2vw27") pod "792e14dc-fdaa-4ea2-a71d-6bef55b41871" (UID: "792e14dc-fdaa-4ea2-a71d-6bef55b41871"). InnerVolumeSpecName "kube-api-access-2vw27". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.105314 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413f1d30-1d47-47b2-a954-91b8ed0134f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.105359 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792e14dc-fdaa-4ea2-a71d-6bef55b41871-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.105369 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vw27\" (UniqueName: \"kubernetes.io/projected/792e14dc-fdaa-4ea2-a71d-6bef55b41871-kube-api-access-2vw27\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.105380 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfff5\" (UniqueName: \"kubernetes.io/projected/413f1d30-1d47-47b2-a954-91b8ed0134f3-kube-api-access-xfff5\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.308174 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-t2mkn" event={"ID":"1e293562-6940-4e74-90c4-a57ba16599ef","Type":"ContainerDied","Data":"1734730209e63e21ef632b1aedefbbd9ca8ba9556cc8bef8aab4b0f7053bc6a5"} Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.308229 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1734730209e63e21ef632b1aedefbbd9ca8ba9556cc8bef8aab4b0f7053bc6a5" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.308197 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-t2mkn" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.311953 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8zbc6" event={"ID":"792e14dc-fdaa-4ea2-a71d-6bef55b41871","Type":"ContainerDied","Data":"c7f6a46e42b7e7b3859fe68be2acb511779f233bd347ba60d0a06792d9036386"} Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.312010 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7f6a46e42b7e7b3859fe68be2acb511779f233bd347ba60d0a06792d9036386" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.312028 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8zbc6" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.313626 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ed51-account-create-rvllg" event={"ID":"413f1d30-1d47-47b2-a954-91b8ed0134f3","Type":"ContainerDied","Data":"7fd40b5bab4b790cb761754c5d374eb5bbb5259e04fd72a4cec128c883327db2"} Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.313692 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fd40b5bab4b790cb761754c5d374eb5bbb5259e04fd72a4cec128c883327db2" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.313641 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ed51-account-create-rvllg" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.322641 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fa87-account-create-nn2fj" event={"ID":"ba0e4040-2d4f-423f-8540-368a4c49bd74","Type":"ContainerDied","Data":"8c2758c59c1334bde42188b0ec527f58105d8c3fc9d84adecc9d8d7b9692e239"} Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.322678 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fa87-account-create-nn2fj" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.322704 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c2758c59c1334bde42188b0ec527f58105d8c3fc9d84adecc9d8d7b9692e239" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.740720 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.740775 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.822782 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cb86a-7325-4151-9b4c-b8af3060b82a-operator-scripts\") pod \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\" (UID: \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\") " Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.822934 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-operator-scripts\") pod \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\" (UID: \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\") " Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.823107 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftpq8\" (UniqueName: \"kubernetes.io/projected/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-kube-api-access-ftpq8\") pod \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\" (UID: \"9ec8aed9-00f5-4a29-af65-f87ab06bdda5\") " Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.823209 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w95ml\" (UniqueName: \"kubernetes.io/projected/2b2cb86a-7325-4151-9b4c-b8af3060b82a-kube-api-access-w95ml\") pod \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\" (UID: \"2b2cb86a-7325-4151-9b4c-b8af3060b82a\") " Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.823987 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2cb86a-7325-4151-9b4c-b8af3060b82a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b2cb86a-7325-4151-9b4c-b8af3060b82a" (UID: "2b2cb86a-7325-4151-9b4c-b8af3060b82a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.824026 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ec8aed9-00f5-4a29-af65-f87ab06bdda5" (UID: "9ec8aed9-00f5-4a29-af65-f87ab06bdda5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.833165 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2cb86a-7325-4151-9b4c-b8af3060b82a-kube-api-access-w95ml" (OuterVolumeSpecName: "kube-api-access-w95ml") pod "2b2cb86a-7325-4151-9b4c-b8af3060b82a" (UID: "2b2cb86a-7325-4151-9b4c-b8af3060b82a"). InnerVolumeSpecName "kube-api-access-w95ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.846184 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-kube-api-access-ftpq8" (OuterVolumeSpecName: "kube-api-access-ftpq8") pod "9ec8aed9-00f5-4a29-af65-f87ab06bdda5" (UID: "9ec8aed9-00f5-4a29-af65-f87ab06bdda5"). InnerVolumeSpecName "kube-api-access-ftpq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.927844 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w95ml\" (UniqueName: \"kubernetes.io/projected/2b2cb86a-7325-4151-9b4c-b8af3060b82a-kube-api-access-w95ml\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.927886 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2cb86a-7325-4151-9b4c-b8af3060b82a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.927897 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:05 crc kubenswrapper[4846]: I1122 09:31:05.927907 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftpq8\" (UniqueName: \"kubernetes.io/projected/9ec8aed9-00f5-4a29-af65-f87ab06bdda5-kube-api-access-ftpq8\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:06 crc kubenswrapper[4846]: I1122 09:31:06.338689 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"5597914ba07ba2614d8262ec0ebd1a340acad46723eff84fca73032e7ac31bcd"} Nov 22 09:31:06 crc kubenswrapper[4846]: I1122 09:31:06.340446 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0e57-account-create-sbb6n" event={"ID":"9ec8aed9-00f5-4a29-af65-f87ab06bdda5","Type":"ContainerDied","Data":"e5580be2d08e8507d32c93d3e985e617af18ed9dcc633f86f3a79f8f797675de"} Nov 22 09:31:06 crc kubenswrapper[4846]: I1122 09:31:06.340799 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5580be2d08e8507d32c93d3e985e617af18ed9dcc633f86f3a79f8f797675de" Nov 22 09:31:06 crc kubenswrapper[4846]: I1122 09:31:06.340855 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0e57-account-create-sbb6n" Nov 22 09:31:06 crc kubenswrapper[4846]: I1122 09:31:06.343060 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mdpsq" event={"ID":"2b2cb86a-7325-4151-9b4c-b8af3060b82a","Type":"ContainerDied","Data":"d1aecfe517d7cb6c15fa4cb84293a50a6ab9a527e5092dd568b3ba70f1cf1155"} Nov 22 09:31:06 crc kubenswrapper[4846]: I1122 09:31:06.343082 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1aecfe517d7cb6c15fa4cb84293a50a6ab9a527e5092dd568b3ba70f1cf1155" Nov 22 09:31:06 crc kubenswrapper[4846]: I1122 09:31:06.343125 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mdpsq" Nov 22 09:31:09 crc kubenswrapper[4846]: I1122 09:31:09.383386 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"b1a0c505e4285d4fc4632fd350ba4ef6eb042633dbfa55990662957f2c16fc0e"} Nov 22 09:31:09 crc kubenswrapper[4846]: I1122 09:31:09.383869 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"99a9998b49e95041da0ed7d2d2653477e44f3aa2504c3286d4b9c81e5fc06c39"} Nov 22 09:31:09 crc kubenswrapper[4846]: I1122 09:31:09.385787 4846 generic.go:334] "Generic (PLEG): container finished" podID="bf68f3c4-7d31-4738-8a2f-97e24d184a29" containerID="61733e54616f04475ee736b0b982cbe49c948a4518fb2dd424d1dae3f6529ba3" exitCode=0 Nov 22 09:31:09 crc kubenswrapper[4846]: I1122 09:31:09.385863 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sdgn9" event={"ID":"bf68f3c4-7d31-4738-8a2f-97e24d184a29","Type":"ContainerDied","Data":"61733e54616f04475ee736b0b982cbe49c948a4518fb2dd424d1dae3f6529ba3"} Nov 22 09:31:09 crc kubenswrapper[4846]: I1122 09:31:09.388893 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nttj" event={"ID":"175f1421-f4ac-4bc9-b3f6-fa5860f556b4","Type":"ContainerStarted","Data":"734062ff9482717a76d935520e9bc3ce7897edcfa111add693f9048416d5c458"} Nov 22 09:31:09 crc kubenswrapper[4846]: I1122 09:31:09.440330 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4nttj" podStartSLOduration=2.513753069 podStartE2EDuration="8.440306177s" podCreationTimestamp="2025-11-22 09:31:01 +0000 UTC" firstStartedPulling="2025-11-22 09:31:02.85298259 +0000 UTC m=+1037.788672239" lastFinishedPulling="2025-11-22 09:31:08.779535668 +0000 UTC m=+1043.715225347" observedRunningTime="2025-11-22 09:31:09.426693238 +0000 UTC m=+1044.362382907" watchObservedRunningTime="2025-11-22 09:31:09.440306177 +0000 UTC m=+1044.375995826" Nov 22 09:31:10 crc kubenswrapper[4846]: I1122 09:31:10.405629 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"e45db03dab9498a1f32f9e03f9f1999fcf91f8905261f3693e460108c6db9566"} Nov 22 09:31:10 crc kubenswrapper[4846]: I1122 09:31:10.835832 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-sdgn9" Nov 22 09:31:10 crc kubenswrapper[4846]: I1122 09:31:10.969195 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-combined-ca-bundle\") pod \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " Nov 22 09:31:10 crc kubenswrapper[4846]: I1122 09:31:10.969254 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjg4p\" (UniqueName: \"kubernetes.io/projected/bf68f3c4-7d31-4738-8a2f-97e24d184a29-kube-api-access-gjg4p\") pod \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " Nov 22 09:31:10 crc kubenswrapper[4846]: I1122 09:31:10.970140 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-db-sync-config-data\") pod \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " Nov 22 09:31:10 crc kubenswrapper[4846]: I1122 09:31:10.970543 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-config-data\") pod \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\" (UID: \"bf68f3c4-7d31-4738-8a2f-97e24d184a29\") " Nov 22 09:31:10 crc kubenswrapper[4846]: I1122 09:31:10.976169 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bf68f3c4-7d31-4738-8a2f-97e24d184a29" (UID: "bf68f3c4-7d31-4738-8a2f-97e24d184a29"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:31:10 crc kubenswrapper[4846]: I1122 09:31:10.978381 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf68f3c4-7d31-4738-8a2f-97e24d184a29-kube-api-access-gjg4p" (OuterVolumeSpecName: "kube-api-access-gjg4p") pod "bf68f3c4-7d31-4738-8a2f-97e24d184a29" (UID: "bf68f3c4-7d31-4738-8a2f-97e24d184a29"). InnerVolumeSpecName "kube-api-access-gjg4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.000500 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf68f3c4-7d31-4738-8a2f-97e24d184a29" (UID: "bf68f3c4-7d31-4738-8a2f-97e24d184a29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.022679 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-config-data" (OuterVolumeSpecName: "config-data") pod "bf68f3c4-7d31-4738-8a2f-97e24d184a29" (UID: "bf68f3c4-7d31-4738-8a2f-97e24d184a29"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.072484 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.072522 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjg4p\" (UniqueName: \"kubernetes.io/projected/bf68f3c4-7d31-4738-8a2f-97e24d184a29-kube-api-access-gjg4p\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.072534 4846 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.072542 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf68f3c4-7d31-4738-8a2f-97e24d184a29-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.420251 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"779af759e07f9223fcd04004319737c0ea06afc0a7289ce116b880bb5f84ba0d"} Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.420323 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"d6fce917287ae5304cb0a70027f33fa2b1d840c907c9efb73b3f0f33d6e5c0f8"} Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.420337 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"8c1dc5ba7665e9b6b13151b16963f773aa8bcc195af72fc6d86bb9b83d192503"} Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.420351 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"b24127b9c9e35c427d2c3c62953d5b2598408d099149941826735df0f3337795"} Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.421941 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sdgn9" event={"ID":"bf68f3c4-7d31-4738-8a2f-97e24d184a29","Type":"ContainerDied","Data":"2fc55a33d0a5f6fb8772142ea96e102119ed847f98328b3f2e9bc1fc9de0d18d"} Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.421965 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fc55a33d0a5f6fb8772142ea96e102119ed847f98328b3f2e9bc1fc9de0d18d" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.422062 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-sdgn9" Nov 22 09:31:11 crc kubenswrapper[4846]: E1122 09:31:11.547028 4846 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf68f3c4_7d31_4738_8a2f_97e24d184a29.slice/crio-2fc55a33d0a5f6fb8772142ea96e102119ed847f98328b3f2e9bc1fc9de0d18d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf68f3c4_7d31_4738_8a2f_97e24d184a29.slice\": RecentStats: unable to find data in memory cache]" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.851514 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cv9l8"] Nov 22 09:31:11 crc kubenswrapper[4846]: E1122 09:31:11.854644 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792e14dc-fdaa-4ea2-a71d-6bef55b41871" containerName="mariadb-database-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.854673 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="792e14dc-fdaa-4ea2-a71d-6bef55b41871" containerName="mariadb-database-create" Nov 22 09:31:11 crc kubenswrapper[4846]: E1122 09:31:11.854690 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec8aed9-00f5-4a29-af65-f87ab06bdda5" containerName="mariadb-account-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.854696 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec8aed9-00f5-4a29-af65-f87ab06bdda5" containerName="mariadb-account-create" Nov 22 09:31:11 crc kubenswrapper[4846]: E1122 09:31:11.854722 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e293562-6940-4e74-90c4-a57ba16599ef" containerName="mariadb-database-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.854728 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e293562-6940-4e74-90c4-a57ba16599ef" containerName="mariadb-database-create" Nov 22 09:31:11 crc kubenswrapper[4846]: E1122 09:31:11.854743 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba0e4040-2d4f-423f-8540-368a4c49bd74" containerName="mariadb-account-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.854749 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0e4040-2d4f-423f-8540-368a4c49bd74" containerName="mariadb-account-create" Nov 22 09:31:11 crc kubenswrapper[4846]: E1122 09:31:11.854764 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413f1d30-1d47-47b2-a954-91b8ed0134f3" containerName="mariadb-account-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.854772 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="413f1d30-1d47-47b2-a954-91b8ed0134f3" containerName="mariadb-account-create" Nov 22 09:31:11 crc kubenswrapper[4846]: E1122 09:31:11.854785 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf68f3c4-7d31-4738-8a2f-97e24d184a29" containerName="glance-db-sync" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.854791 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf68f3c4-7d31-4738-8a2f-97e24d184a29" containerName="glance-db-sync" Nov 22 09:31:11 crc kubenswrapper[4846]: E1122 09:31:11.854807 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2cb86a-7325-4151-9b4c-b8af3060b82a" containerName="mariadb-database-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.854814 4846 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2b2cb86a-7325-4151-9b4c-b8af3060b82a" containerName="mariadb-database-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.855011 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ec8aed9-00f5-4a29-af65-f87ab06bdda5" containerName="mariadb-account-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.855025 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="792e14dc-fdaa-4ea2-a71d-6bef55b41871" containerName="mariadb-database-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.855034 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="413f1d30-1d47-47b2-a954-91b8ed0134f3" containerName="mariadb-account-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.855059 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba0e4040-2d4f-423f-8540-368a4c49bd74" containerName="mariadb-account-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.855073 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2cb86a-7325-4151-9b4c-b8af3060b82a" containerName="mariadb-database-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.855084 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e293562-6940-4e74-90c4-a57ba16599ef" containerName="mariadb-database-create" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.855098 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf68f3c4-7d31-4738-8a2f-97e24d184a29" containerName="glance-db-sync" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.856306 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:11 crc kubenswrapper[4846]: I1122 09:31:11.870200 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cv9l8"] Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.005798 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.005866 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-config\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.005909 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zprgq\" (UniqueName: \"kubernetes.io/projected/fff32d09-37f8-4e71-944d-1bb60c9f24f2-kube-api-access-zprgq\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.005926 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.005998 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.107497 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.107574 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-config\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.107609 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zprgq\" (UniqueName: \"kubernetes.io/projected/fff32d09-37f8-4e71-944d-1bb60c9f24f2-kube-api-access-zprgq\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.107628 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.107723 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.109058 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.109076 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.109516 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 
09:31:12.109679 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-config\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.130176 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zprgq\" (UniqueName: \"kubernetes.io/projected/fff32d09-37f8-4e71-944d-1bb60c9f24f2-kube-api-access-zprgq\") pod \"dnsmasq-dns-5b946c75cc-cv9l8\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.196631 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.440854 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"41886e4758b40d1476ba494d083f3c0bf9bcdd16708572137acb8aa0d807d9bb"} Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.441260 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"665dd94d8e898186b6e533202d42fafcdf5e1fe5c8ca3ecaa2d1c6275075b835"} Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.441273 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"575c6d2b-ae18-48ec-a314-211ccd078d87","Type":"ContainerStarted","Data":"4cf5bb3c76dd9939c771de4dd75b93b02b43a8edc36671faa2f4a5e51c3d14fb"} Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.499899 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.072944277 podStartE2EDuration="46.499869092s" podCreationTimestamp="2025-11-22 09:30:26 +0000 UTC" firstStartedPulling="2025-11-22 09:31:00.03834772 +0000 UTC m=+1034.974037389" lastFinishedPulling="2025-11-22 09:31:10.465272535 +0000 UTC m=+1045.400962204" observedRunningTime="2025-11-22 09:31:12.488657213 +0000 UTC m=+1047.424346852" watchObservedRunningTime="2025-11-22 09:31:12.499869092 +0000 UTC m=+1047.435558741" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.742970 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cv9l8"] Nov 22 09:31:12 crc kubenswrapper[4846]: W1122 09:31:12.751146 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfff32d09_37f8_4e71_944d_1bb60c9f24f2.slice/crio-f60453ea10e2f90cb3ff580092ba3ba0d56c87b561d0a3f42d77b52a813b63a4 WatchSource:0}: Error finding container f60453ea10e2f90cb3ff580092ba3ba0d56c87b561d0a3f42d77b52a813b63a4: Status 404 returned error can't find the container with id f60453ea10e2f90cb3ff580092ba3ba0d56c87b561d0a3f42d77b52a813b63a4 Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.790322 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cv9l8"] Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.832730 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gggcj"] Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.838581 4846 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.847618 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gggcj"] Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.847810 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.924614 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.924970 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-config\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.925085 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9d9f\" (UniqueName: \"kubernetes.io/projected/11494284-5130-4203-8185-91958a668040-kube-api-access-d9d9f\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.925223 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.925309 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:12 crc kubenswrapper[4846]: I1122 09:31:12.925490 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.029425 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.029577 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: 
\"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.029618 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-config\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.029664 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9d9f\" (UniqueName: \"kubernetes.io/projected/11494284-5130-4203-8185-91958a668040-kube-api-access-d9d9f\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.029737 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.029778 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.032734 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.037875 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.037913 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.038230 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-config\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.038412 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: 
I1122 09:31:13.052704 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9d9f\" (UniqueName: \"kubernetes.io/projected/11494284-5130-4203-8185-91958a668040-kube-api-access-d9d9f\") pod \"dnsmasq-dns-74f6bcbc87-gggcj\" (UID: \"11494284-5130-4203-8185-91958a668040\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.260853 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.462132 4846 generic.go:334] "Generic (PLEG): container finished" podID="fff32d09-37f8-4e71-944d-1bb60c9f24f2" containerID="43b0e85c9a2e6bfd2e28f0a72e96e23cce03cdc8f119d43f7ea553fbd7e00d71" exitCode=0 Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.462284 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" event={"ID":"fff32d09-37f8-4e71-944d-1bb60c9f24f2","Type":"ContainerDied","Data":"43b0e85c9a2e6bfd2e28f0a72e96e23cce03cdc8f119d43f7ea553fbd7e00d71"} Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.462591 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" event={"ID":"fff32d09-37f8-4e71-944d-1bb60c9f24f2","Type":"ContainerStarted","Data":"f60453ea10e2f90cb3ff580092ba3ba0d56c87b561d0a3f42d77b52a813b63a4"} Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.469900 4846 generic.go:334] "Generic (PLEG): container finished" podID="175f1421-f4ac-4bc9-b3f6-fa5860f556b4" containerID="734062ff9482717a76d935520e9bc3ce7897edcfa111add693f9048416d5c458" exitCode=0 Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.470196 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nttj" event={"ID":"175f1421-f4ac-4bc9-b3f6-fa5860f556b4","Type":"ContainerDied","Data":"734062ff9482717a76d935520e9bc3ce7897edcfa111add693f9048416d5c458"} Nov 22 09:31:13 crc kubenswrapper[4846]: I1122 09:31:13.754954 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gggcj"] Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.036224 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.173396 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-dns-svc\") pod \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.173535 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-nb\") pod \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.173596 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zprgq\" (UniqueName: \"kubernetes.io/projected/fff32d09-37f8-4e71-944d-1bb60c9f24f2-kube-api-access-zprgq\") pod \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.173657 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-config\") pod \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.173764 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-sb\") pod \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\" (UID: \"fff32d09-37f8-4e71-944d-1bb60c9f24f2\") " Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.187639 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff32d09-37f8-4e71-944d-1bb60c9f24f2-kube-api-access-zprgq" (OuterVolumeSpecName: "kube-api-access-zprgq") pod "fff32d09-37f8-4e71-944d-1bb60c9f24f2" (UID: "fff32d09-37f8-4e71-944d-1bb60c9f24f2"). InnerVolumeSpecName "kube-api-access-zprgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.205955 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fff32d09-37f8-4e71-944d-1bb60c9f24f2" (UID: "fff32d09-37f8-4e71-944d-1bb60c9f24f2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.213538 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-config" (OuterVolumeSpecName: "config") pod "fff32d09-37f8-4e71-944d-1bb60c9f24f2" (UID: "fff32d09-37f8-4e71-944d-1bb60c9f24f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.223766 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fff32d09-37f8-4e71-944d-1bb60c9f24f2" (UID: "fff32d09-37f8-4e71-944d-1bb60c9f24f2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.235672 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fff32d09-37f8-4e71-944d-1bb60c9f24f2" (UID: "fff32d09-37f8-4e71-944d-1bb60c9f24f2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.275395 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.275653 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zprgq\" (UniqueName: \"kubernetes.io/projected/fff32d09-37f8-4e71-944d-1bb60c9f24f2-kube-api-access-zprgq\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.275758 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.275818 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.275886 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff32d09-37f8-4e71-944d-1bb60c9f24f2-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.483398 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" event={"ID":"fff32d09-37f8-4e71-944d-1bb60c9f24f2","Type":"ContainerDied","Data":"f60453ea10e2f90cb3ff580092ba3ba0d56c87b561d0a3f42d77b52a813b63a4"} Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.483441 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-cv9l8" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.483490 4846 scope.go:117] "RemoveContainer" containerID="43b0e85c9a2e6bfd2e28f0a72e96e23cce03cdc8f119d43f7ea553fbd7e00d71" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.486270 4846 generic.go:334] "Generic (PLEG): container finished" podID="11494284-5130-4203-8185-91958a668040" containerID="4788be4bd23f1a2c1c823f2a3357dc694cc33fc06e02a41e80d0c016477bf2c0" exitCode=0 Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.486430 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" event={"ID":"11494284-5130-4203-8185-91958a668040","Type":"ContainerDied","Data":"4788be4bd23f1a2c1c823f2a3357dc694cc33fc06e02a41e80d0c016477bf2c0"} Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.486570 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" event={"ID":"11494284-5130-4203-8185-91958a668040","Type":"ContainerStarted","Data":"4cec489f4bf2119e2ea3cffaaae0b207d787782d081799d0c5bf3f5145149d01"} Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.579310 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cv9l8"] Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.586833 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cv9l8"] Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.877211 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.988556 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-config-data\") pod \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.988805 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-combined-ca-bundle\") pod \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.989000 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkx8j\" (UniqueName: \"kubernetes.io/projected/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-kube-api-access-nkx8j\") pod \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\" (UID: \"175f1421-f4ac-4bc9-b3f6-fa5860f556b4\") " Nov 22 09:31:14 crc kubenswrapper[4846]: I1122 09:31:14.992992 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-kube-api-access-nkx8j" (OuterVolumeSpecName: "kube-api-access-nkx8j") pod "175f1421-f4ac-4bc9-b3f6-fa5860f556b4" (UID: "175f1421-f4ac-4bc9-b3f6-fa5860f556b4"). InnerVolumeSpecName "kube-api-access-nkx8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.018283 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "175f1421-f4ac-4bc9-b3f6-fa5860f556b4" (UID: "175f1421-f4ac-4bc9-b3f6-fa5860f556b4"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.051389 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-config-data" (OuterVolumeSpecName: "config-data") pod "175f1421-f4ac-4bc9-b3f6-fa5860f556b4" (UID: "175f1421-f4ac-4bc9-b3f6-fa5860f556b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.092533 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.092576 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkx8j\" (UniqueName: \"kubernetes.io/projected/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-kube-api-access-nkx8j\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.092590 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/175f1421-f4ac-4bc9-b3f6-fa5860f556b4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.498870 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nttj" event={"ID":"175f1421-f4ac-4bc9-b3f6-fa5860f556b4","Type":"ContainerDied","Data":"fbeeab8dfd52821760f76608e32798299064703492a600dea047146e083b7a3b"} Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.498900 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4nttj" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.498922 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbeeab8dfd52821760f76608e32798299064703492a600dea047146e083b7a3b" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.501213 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" event={"ID":"11494284-5130-4203-8185-91958a668040","Type":"ContainerStarted","Data":"373fe77124db74b63854d6baa02e4f8690f2baae857975c03503918f4d0899fa"} Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.501394 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.534321 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" podStartSLOduration=3.53429253 podStartE2EDuration="3.53429253s" podCreationTimestamp="2025-11-22 09:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:31:15.525568445 +0000 UTC m=+1050.461258094" watchObservedRunningTime="2025-11-22 09:31:15.53429253 +0000 UTC m=+1050.469982179" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.708494 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gggcj"] Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.718674 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wx8lz"] Nov 22 09:31:15 crc kubenswrapper[4846]: E1122 09:31:15.722947 4846 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="175f1421-f4ac-4bc9-b3f6-fa5860f556b4" containerName="keystone-db-sync" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.722989 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="175f1421-f4ac-4bc9-b3f6-fa5860f556b4" containerName="keystone-db-sync" Nov 22 09:31:15 crc kubenswrapper[4846]: E1122 09:31:15.723091 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff32d09-37f8-4e71-944d-1bb60c9f24f2" containerName="init" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.723100 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff32d09-37f8-4e71-944d-1bb60c9f24f2" containerName="init" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.723382 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff32d09-37f8-4e71-944d-1bb60c9f24f2" containerName="init" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.723406 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="175f1421-f4ac-4bc9-b3f6-fa5860f556b4" containerName="keystone-db-sync" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.724095 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.724578 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wx8lz"] Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.727027 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d54gt" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.728173 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.728420 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.728604 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.733464 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.805138 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7tlsb"] Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.806845 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.816606 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4l5\" (UniqueName: \"kubernetes.io/projected/428b3f4b-ed9c-4e57-b374-65ec61230d39-kube-api-access-2m4l5\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.816758 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-combined-ca-bundle\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.816877 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-credential-keys\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.816975 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-scripts\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.817067 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-config-data\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.817181 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-fernet-keys\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.843443 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7tlsb"] Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.916346 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9m5n9"] Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918640 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-scripts\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918721 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-config\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:15 crc 
kubenswrapper[4846]: I1122 09:31:15.918745 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-config-data\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918777 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcwws\" (UniqueName: \"kubernetes.io/projected/87d707ea-60ff-4f96-acc8-b99fdd56cf03-kube-api-access-fcwws\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918800 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918847 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-svc\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918868 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918889 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-fernet-keys\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918908 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4l5\" (UniqueName: \"kubernetes.io/projected/428b3f4b-ed9c-4e57-b374-65ec61230d39-kube-api-access-2m4l5\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918944 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-combined-ca-bundle\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.918984 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-credential-keys\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc 
kubenswrapper[4846]: I1122 09:31:15.919018 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.920535 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.925524 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-scripts\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.925912 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-credential-keys\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.931284 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.932020 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qkc66" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.932285 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.932923 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-fernet-keys\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.933623 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-combined-ca-bundle\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.968585 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7965d89547-s58k6"] Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.970261 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.979502 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.979789 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-26rmp" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.979972 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.980029 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.981124 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m4l5\" (UniqueName: \"kubernetes.io/projected/428b3f4b-ed9c-4e57-b374-65ec61230d39-kube-api-access-2m4l5\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.990734 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-config-data\") pod \"keystone-bootstrap-wx8lz\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:15 crc kubenswrapper[4846]: I1122 09:31:15.990825 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9m5n9"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.014122 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xgp99"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.015459 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.018949 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.062321 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.081251 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.088983 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zqsc4" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.228964 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldgsg\" (UniqueName: \"kubernetes.io/projected/584aeb0f-b1a9-4a6e-b129-b21593065b18-kube-api-access-ldgsg\") pod \"neutron-db-sync-9m5n9\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.229070 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-config\") pod \"neutron-db-sync-9m5n9\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.229123 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-config\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.229226 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.229252 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcwws\" (UniqueName: \"kubernetes.io/projected/87d707ea-60ff-4f96-acc8-b99fdd56cf03-kube-api-access-fcwws\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.229375 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-svc\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.229403 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-combined-ca-bundle\") pod \"neutron-db-sync-9m5n9\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.231522 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-config\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.233830 
4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-svc\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.237297 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.247975 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.264009 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.264523 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.266221 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.308414 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff32d09-37f8-4e71-944d-1bb60c9f24f2" path="/var/lib/kubelet/pods/fff32d09-37f8-4e71-944d-1bb60c9f24f2/volumes" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.333709 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7965d89547-s58k6"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.333749 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xgp99"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.333763 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xh58j"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.351115 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcwws\" (UniqueName: \"kubernetes.io/projected/87d707ea-60ff-4f96-acc8-b99fdd56cf03-kube-api-access-fcwws\") pod \"dnsmasq-dns-847c4cc679-7tlsb\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.351382 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.358568 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.358789 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.359022 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4l2sw" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.367441 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-db-sync-config-data\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.367528 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-scripts\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.367573 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnp7w\" (UniqueName: \"kubernetes.io/projected/13948906-431d-4990-b5be-32a21b8113e9-kube-api-access-jnp7w\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.367605 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldgsg\" (UniqueName: \"kubernetes.io/projected/584aeb0f-b1a9-4a6e-b129-b21593065b18-kube-api-access-ldgsg\") pod \"neutron-db-sync-9m5n9\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.367632 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-config\") pod \"neutron-db-sync-9m5n9\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.367676 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/13948906-431d-4990-b5be-32a21b8113e9-horizon-secret-key\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.380996 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-config-data\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.380830 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7tlsb"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.381823 4846 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.382158 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-config-data\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.382192 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-combined-ca-bundle\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.382236 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-combined-ca-bundle\") pod \"neutron-db-sync-9m5n9\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.382266 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hczs\" (UniqueName: \"kubernetes.io/projected/083da0b8-38d6-4eab-b211-8389df97a0a8-kube-api-access-9hczs\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.382297 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-scripts\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.382317 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13948906-431d-4990-b5be-32a21b8113e9-logs\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.382364 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/083da0b8-38d6-4eab-b211-8389df97a0a8-etc-machine-id\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.395013 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-config\") pod \"neutron-db-sync-9m5n9\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.399420 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldgsg\" (UniqueName: \"kubernetes.io/projected/584aeb0f-b1a9-4a6e-b129-b21593065b18-kube-api-access-ldgsg\") pod \"neutron-db-sync-9m5n9\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " pod="openstack/neutron-db-sync-9m5n9" 
Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.411885 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-combined-ca-bundle\") pod \"neutron-db-sync-9m5n9\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.424334 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.424937 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xh58j"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.436908 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.439382 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.441954 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.444292 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.444559 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.447916 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-vz6qx"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.449224 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.455332 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mtz29"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.461703 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.468943 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cq95r" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.469384 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.478364 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f84b6b857-6zgx9"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.480336 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495561 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495606 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-log-httpd\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495645 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hczs\" (UniqueName: \"kubernetes.io/projected/083da0b8-38d6-4eab-b211-8389df97a0a8-kube-api-access-9hczs\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495666 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-scripts\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495687 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlljj\" (UniqueName: \"kubernetes.io/projected/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-kube-api-access-vlljj\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495708 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-scripts\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495726 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13948906-431d-4990-b5be-32a21b8113e9-logs\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495752 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/083da0b8-38d6-4eab-b211-8389df97a0a8-etc-machine-id\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495788 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-scripts\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495816 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-combined-ca-bundle\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495834 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495854 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-db-sync-config-data\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495883 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c8a1c0-2155-4d68-971a-e68aff9e5133-logs\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495911 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-config-data\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495937 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-config-data\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495957 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-scripts\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.495985 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnp7w\" (UniqueName: \"kubernetes.io/projected/13948906-431d-4990-b5be-32a21b8113e9-kube-api-access-jnp7w\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.496021 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-run-httpd\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.496057 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/13948906-431d-4990-b5be-32a21b8113e9-horizon-secret-key\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.496085 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-config-data\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.496105 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-config-data\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.496128 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-combined-ca-bundle\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.496177 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjstk\" (UniqueName: \"kubernetes.io/projected/49c8a1c0-2155-4d68-971a-e68aff9e5133-kube-api-access-xjstk\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.497414 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-vz6qx"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.497862 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13948906-431d-4990-b5be-32a21b8113e9-logs\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.497926 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/083da0b8-38d6-4eab-b211-8389df97a0a8-etc-machine-id\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.504339 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-scripts\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.506455 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-config-data\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.514828 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-db-sync-config-data\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.523336 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hczs\" (UniqueName: \"kubernetes.io/projected/083da0b8-38d6-4eab-b211-8389df97a0a8-kube-api-access-9hczs\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.528479 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-config-data\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.530254 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-combined-ca-bundle\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.531828 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/13948906-431d-4990-b5be-32a21b8113e9-horizon-secret-key\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.532788 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-scripts\") pod \"cinder-db-sync-xgp99\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.534419 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mtz29"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.535395 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnp7w\" (UniqueName: \"kubernetes.io/projected/13948906-431d-4990-b5be-32a21b8113e9-kube-api-access-jnp7w\") pod \"horizon-7965d89547-s58k6\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.552392 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f84b6b857-6zgx9"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.581402 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.583517 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.586630 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.588108 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vfpx8" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.588315 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.588531 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.600208 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-combined-ca-bundle\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601017 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601072 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-db-sync-config-data\") pod \"barbican-db-sync-mtz29\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601127 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601162 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-config-data\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601198 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c8a1c0-2155-4d68-971a-e68aff9e5133-logs\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601229 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5bc\" (UniqueName: \"kubernetes.io/projected/d9550585-5ddb-45d1-9471-884d030282fb-kube-api-access-8g5bc\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 
09:31:16.601261 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-config-data\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601290 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-config-data\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601360 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvztq\" (UniqueName: \"kubernetes.io/projected/e3b50139-f7ff-477a-8f59-5b72e0413206-kube-api-access-rvztq\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601415 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-scripts\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601444 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-run-httpd\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601474 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b50139-f7ff-477a-8f59-5b72e0413206-horizon-secret-key\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601518 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601637 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601675 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjstk\" (UniqueName: \"kubernetes.io/projected/49c8a1c0-2155-4d68-971a-e68aff9e5133-kube-api-access-xjstk\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601701 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601725 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-log-httpd\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601753 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-config\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601790 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-combined-ca-bundle\") pod \"barbican-db-sync-mtz29\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601925 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601955 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-scripts\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.601979 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlljj\" (UniqueName: \"kubernetes.io/projected/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-kube-api-access-vlljj\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.602002 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b50139-f7ff-477a-8f59-5b72e0413206-logs\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.602068 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmgb5\" (UniqueName: \"kubernetes.io/projected/bb094f7c-1527-476d-bf4a-d54a022320d0-kube-api-access-jmgb5\") pod \"barbican-db-sync-mtz29\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.602143 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-scripts\") pod \"ceilometer-0\" 
(UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.604612 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c8a1c0-2155-4d68-971a-e68aff9e5133-logs\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.605008 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-run-httpd\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.606085 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-log-httpd\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.608166 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-combined-ca-bundle\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.608689 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xgp99" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.608959 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-config-data\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.609374 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-scripts\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.611575 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.622508 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.626234 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-scripts\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.628736 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.637397 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-config-data\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.645514 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjstk\" (UniqueName: \"kubernetes.io/projected/49c8a1c0-2155-4d68-971a-e68aff9e5133-kube-api-access-xjstk\") pod \"placement-db-sync-xh58j\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.645511 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlljj\" (UniqueName: \"kubernetes.io/projected/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-kube-api-access-vlljj\") pod \"ceilometer-0\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.678482 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xh58j" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703527 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703589 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703635 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-config\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703692 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-combined-ca-bundle\") pod \"barbican-db-sync-mtz29\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703710 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703728 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b50139-f7ff-477a-8f59-5b72e0413206-logs\") pod \"horizon-5f84b6b857-6zgx9\" (UID: 
\"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703747 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703787 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmgb5\" (UniqueName: \"kubernetes.io/projected/bb094f7c-1527-476d-bf4a-d54a022320d0-kube-api-access-jmgb5\") pod \"barbican-db-sync-mtz29\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703832 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703869 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703890 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-db-sync-config-data\") pod \"barbican-db-sync-mtz29\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703913 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703930 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-config-data\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703954 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5bc\" (UniqueName: \"kubernetes.io/projected/d9550585-5ddb-45d1-9471-884d030282fb-kube-api-access-8g5bc\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703978 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.703999 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.704019 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-logs\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.704072 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvztq\" (UniqueName: \"kubernetes.io/projected/e3b50139-f7ff-477a-8f59-5b72e0413206-kube-api-access-rvztq\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.704140 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-scripts\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.704161 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b50139-f7ff-477a-8f59-5b72e0413206-horizon-secret-key\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.704183 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45l6g\" (UniqueName: \"kubernetes.io/projected/240e8184-1e6d-4b28-bd27-80bb8e200f3f-kube-api-access-45l6g\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.704205 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.705130 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.705396 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.706889 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-config\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.707206 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.707582 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-scripts\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.708296 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-config-data\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.708732 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.708978 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b50139-f7ff-477a-8f59-5b72e0413206-logs\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.721242 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-db-sync-config-data\") pod \"barbican-db-sync-mtz29\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.731115 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b50139-f7ff-477a-8f59-5b72e0413206-horizon-secret-key\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.734226 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-combined-ca-bundle\") pod \"barbican-db-sync-mtz29\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.734973 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5bc\" 
(UniqueName: \"kubernetes.io/projected/d9550585-5ddb-45d1-9471-884d030282fb-kube-api-access-8g5bc\") pod \"dnsmasq-dns-785d8bcb8c-vz6qx\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.735683 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvztq\" (UniqueName: \"kubernetes.io/projected/e3b50139-f7ff-477a-8f59-5b72e0413206-kube-api-access-rvztq\") pod \"horizon-5f84b6b857-6zgx9\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.737618 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmgb5\" (UniqueName: \"kubernetes.io/projected/bb094f7c-1527-476d-bf4a-d54a022320d0-kube-api-access-jmgb5\") pod \"barbican-db-sync-mtz29\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.759704 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.794640 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.807736 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45l6g\" (UniqueName: \"kubernetes.io/projected/240e8184-1e6d-4b28-bd27-80bb8e200f3f-kube-api-access-45l6g\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.807810 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.807855 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.807904 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.807928 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.807971 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.807996 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.808014 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-logs\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.808835 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-logs\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.809680 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.812636 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mtz29" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.812649 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.812898 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.814500 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.818837 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.820928 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 
09:31:16.853184 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wx8lz"] Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.853643 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.863325 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45l6g\" (UniqueName: \"kubernetes.io/projected/240e8184-1e6d-4b28-bd27-80bb8e200f3f-kube-api-access-45l6g\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:16 crc kubenswrapper[4846]: I1122 09:31:16.894900 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " pod="openstack/glance-default-external-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.022794 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.024279 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.155566 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9m5n9"] Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.167442 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.173598 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.178583 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.178954 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.209431 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.220674 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmnr\" (UniqueName: \"kubernetes.io/projected/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-kube-api-access-vmmnr\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.220772 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.220807 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.220903 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.221100 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.221127 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-logs\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.221180 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.221297 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.259507 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7tlsb"] Nov 22 09:31:17 crc kubenswrapper[4846]: W1122 09:31:17.286350 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod584aeb0f_b1a9_4a6e_b129_b21593065b18.slice/crio-01cc4512fd58cbfe50ee535a46de94c62ee10d68ebc52f87bd58f3efac96ad36 WatchSource:0}: Error finding container 01cc4512fd58cbfe50ee535a46de94c62ee10d68ebc52f87bd58f3efac96ad36: Status 404 returned error can't find the container with id 01cc4512fd58cbfe50ee535a46de94c62ee10d68ebc52f87bd58f3efac96ad36 Nov 22 09:31:17 crc kubenswrapper[4846]: W1122 09:31:17.289100 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87d707ea_60ff_4f96_acc8_b99fdd56cf03.slice/crio-11fcb659ad3a881978e1fafd0587d6d2859ca722b6c7be837a2259c808038260 WatchSource:0}: Error finding container 11fcb659ad3a881978e1fafd0587d6d2859ca722b6c7be837a2259c808038260: Status 404 returned error can't find the container with id 11fcb659ad3a881978e1fafd0587d6d2859ca722b6c7be837a2259c808038260 Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.324890 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmmnr\" (UniqueName: \"kubernetes.io/projected/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-kube-api-access-vmmnr\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.325360 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.325385 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.325443 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.325515 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.325545 4846 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-logs\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.325570 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.325626 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.327898 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-logs\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.327958 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.328194 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.337028 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.339291 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.341268 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.346702 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmnr\" (UniqueName: 
\"kubernetes.io/projected/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-kube-api-access-vmmnr\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.354063 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.440996 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xh58j"] Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.467281 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.577225 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xh58j" event={"ID":"49c8a1c0-2155-4d68-971a-e68aff9e5133","Type":"ContainerStarted","Data":"72efe662024f5da8f71db0a78c50ddd0d820860a7dc56628e27bf29ef8953527"} Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.581709 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wx8lz" event={"ID":"428b3f4b-ed9c-4e57-b374-65ec61230d39","Type":"ContainerStarted","Data":"a77fa3ccd8546ff6ea50bfc2ea6eaf03c6ea40aaf204287508fdb14c1f97d492"} Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.590697 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" event={"ID":"87d707ea-60ff-4f96-acc8-b99fdd56cf03","Type":"ContainerStarted","Data":"11fcb659ad3a881978e1fafd0587d6d2859ca722b6c7be837a2259c808038260"} Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.593494 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9m5n9" event={"ID":"584aeb0f-b1a9-4a6e-b129-b21593065b18","Type":"ContainerStarted","Data":"01cc4512fd58cbfe50ee535a46de94c62ee10d68ebc52f87bd58f3efac96ad36"} Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.593736 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" podUID="11494284-5130-4203-8185-91958a668040" containerName="dnsmasq-dns" containerID="cri-o://373fe77124db74b63854d6baa02e4f8690f2baae857975c03503918f4d0899fa" gracePeriod=10 Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.605273 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.749956 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xgp99"] Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.775396 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-vz6qx"] Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.785353 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.868488 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mtz29"] Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.877519 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7965d89547-s58k6"] Nov 22 09:31:17 crc kubenswrapper[4846]: I1122 09:31:17.976897 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f84b6b857-6zgx9"] Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.167087 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.398216 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.613956 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f84b6b857-6zgx9" event={"ID":"e3b50139-f7ff-477a-8f59-5b72e0413206","Type":"ContainerStarted","Data":"e7e0678a96ff22e505933794b540750776cee87fd6c43fc65e541fd0a3eb8982"} Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.624987 4846 generic.go:334] "Generic (PLEG): container finished" podID="87d707ea-60ff-4f96-acc8-b99fdd56cf03" containerID="85d0d3a45199908a0ef519bcb0f47bc214d0d70e1b47cd3e880df21c18bda6b4" exitCode=0 Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.625425 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" event={"ID":"87d707ea-60ff-4f96-acc8-b99fdd56cf03","Type":"ContainerDied","Data":"85d0d3a45199908a0ef519bcb0f47bc214d0d70e1b47cd3e880df21c18bda6b4"} Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.660593 4846 generic.go:334] "Generic (PLEG): container finished" podID="11494284-5130-4203-8185-91958a668040" containerID="373fe77124db74b63854d6baa02e4f8690f2baae857975c03503918f4d0899fa" exitCode=0 Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.661179 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" event={"ID":"11494284-5130-4203-8185-91958a668040","Type":"ContainerDied","Data":"373fe77124db74b63854d6baa02e4f8690f2baae857975c03503918f4d0899fa"} Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.760418 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9m5n9" event={"ID":"584aeb0f-b1a9-4a6e-b129-b21593065b18","Type":"ContainerStarted","Data":"3750c0f79fd15b824c4a981dda2963917b77fa7732dee70a171168e51b897084"} Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.825443 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a73a3301-2360-46f7-ae88-9dd9c93f0fb7","Type":"ContainerStarted","Data":"9d8f872c50b3cb9fc9b7f17c045000b84647c4c705cfc9aab91e5f8ba797adc3"} Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.825756 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.826163 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9m5n9" podStartSLOduration=3.82614498 podStartE2EDuration="3.82614498s" podCreationTimestamp="2025-11-22 09:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:31:18.813091418 +0000 UTC m=+1053.748781087" watchObservedRunningTime="2025-11-22 09:31:18.82614498 +0000 UTC m=+1053.761834629" Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.892828 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-config\") pod \"11494284-5130-4203-8185-91958a668040\" (UID: \"11494284-5130-4203-8185-91958a668040\") " Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.892878 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-nb\") pod \"11494284-5130-4203-8185-91958a668040\" (UID: \"11494284-5130-4203-8185-91958a668040\") " Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.892936 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-swift-storage-0\") pod \"11494284-5130-4203-8185-91958a668040\" (UID: \"11494284-5130-4203-8185-91958a668040\") " Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.892991 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-svc\") pod \"11494284-5130-4203-8185-91958a668040\" (UID: \"11494284-5130-4203-8185-91958a668040\") " Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.893084 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-sb\") pod \"11494284-5130-4203-8185-91958a668040\" (UID: \"11494284-5130-4203-8185-91958a668040\") " Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.893120 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9d9f\" (UniqueName: \"kubernetes.io/projected/11494284-5130-4203-8185-91958a668040-kube-api-access-d9d9f\") pod \"11494284-5130-4203-8185-91958a668040\" (UID: \"11494284-5130-4203-8185-91958a668040\") " Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.975902 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11494284-5130-4203-8185-91958a668040-kube-api-access-d9d9f" (OuterVolumeSpecName: "kube-api-access-d9d9f") pod "11494284-5130-4203-8185-91958a668040" (UID: "11494284-5130-4203-8185-91958a668040"). InnerVolumeSpecName "kube-api-access-d9d9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.984423 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7965d89547-s58k6"] Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.991345 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mtz29" event={"ID":"bb094f7c-1527-476d-bf4a-d54a022320d0","Type":"ContainerStarted","Data":"001fc2524290ea1fccdfdb136498deccdb2c92898c7c0ff0479e17908071df43"} Nov 22 09:31:18 crc kubenswrapper[4846]: I1122 09:31:18.996252 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9d9f\" (UniqueName: \"kubernetes.io/projected/11494284-5130-4203-8185-91958a668040-kube-api-access-d9d9f\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.030164 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wx8lz" event={"ID":"428b3f4b-ed9c-4e57-b374-65ec61230d39","Type":"ContainerStarted","Data":"bdb46927f8a6f6c0c5bef27f83b8cbbe2bf27a096578db3a382af5a48f60fd4a"} Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.079682 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58dd4457b9-h4l2s"] Nov 22 09:31:19 crc kubenswrapper[4846]: E1122 09:31:19.080322 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11494284-5130-4203-8185-91958a668040" containerName="dnsmasq-dns" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.080344 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="11494284-5130-4203-8185-91958a668040" containerName="dnsmasq-dns" Nov 22 09:31:19 crc kubenswrapper[4846]: E1122 09:31:19.080374 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11494284-5130-4203-8185-91958a668040" containerName="init" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.080382 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="11494284-5130-4203-8185-91958a668040" containerName="init" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.080596 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="11494284-5130-4203-8185-91958a668040" containerName="dnsmasq-dns" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.081925 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.087318 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"240e8184-1e6d-4b28-bd27-80bb8e200f3f","Type":"ContainerStarted","Data":"8088ba84eae825179e4280c71aba2abf277aaa251baef183a9dbf4e12ea4e64d"} Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.099326 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a41b464c-76bd-4527-82a2-69408a024f68-horizon-secret-key\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.099390 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-scripts\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.099415 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-config-data\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.099447 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a41b464c-76bd-4527-82a2-69408a024f68-logs\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.099480 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr6lz\" (UniqueName: \"kubernetes.io/projected/a41b464c-76bd-4527-82a2-69408a024f68-kube-api-access-qr6lz\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.103099 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xgp99" event={"ID":"083da0b8-38d6-4eab-b211-8389df97a0a8","Type":"ContainerStarted","Data":"a97d32c8e6aa0d4cd658af678eaddc8fda63ec72ff34d23471f5974587c31c09"} Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.133763 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bff9dfbc-88dc-4ecc-95f3-4eac40350d97","Type":"ContainerStarted","Data":"2597f58413f7b206c485f5b1a293f17ab41e1a50b2325ee7aeb2aa3474fbeecb"} Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.147178 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11494284-5130-4203-8185-91958a668040" (UID: "11494284-5130-4203-8185-91958a668040"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.170759 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "11494284-5130-4203-8185-91958a668040" (UID: "11494284-5130-4203-8185-91958a668040"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.173273 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-config" (OuterVolumeSpecName: "config") pod "11494284-5130-4203-8185-91958a668040" (UID: "11494284-5130-4203-8185-91958a668040"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.170516 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7965d89547-s58k6" event={"ID":"13948906-431d-4990-b5be-32a21b8113e9","Type":"ContainerStarted","Data":"37c3a8e016eeb97594fde6775f1c2c8bed344a4da524dfbfc3c8fd4017521b28"} Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.178301 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.178595 4846 generic.go:334] "Generic (PLEG): container finished" podID="d9550585-5ddb-45d1-9471-884d030282fb" containerID="83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6" exitCode=0 Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.178651 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" event={"ID":"d9550585-5ddb-45d1-9471-884d030282fb","Type":"ContainerDied","Data":"83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6"} Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.178689 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" event={"ID":"d9550585-5ddb-45d1-9471-884d030282fb","Type":"ContainerStarted","Data":"ac7974430f1484daca8518d50c9eef6743e840eb437a67b5e89a91dc7aa91c6c"} Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.189070 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "11494284-5130-4203-8185-91958a668040" (UID: "11494284-5130-4203-8185-91958a668040"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.201449 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr6lz\" (UniqueName: \"kubernetes.io/projected/a41b464c-76bd-4527-82a2-69408a024f68-kube-api-access-qr6lz\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.201639 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a41b464c-76bd-4527-82a2-69408a024f68-horizon-secret-key\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.201696 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-scripts\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.201733 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-config-data\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.201773 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a41b464c-76bd-4527-82a2-69408a024f68-logs\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.201888 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.201962 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.201983 4846 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.202030 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.202529 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a41b464c-76bd-4527-82a2-69408a024f68-logs\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.220885 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-scripts\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.221995 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-config-data\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.227795 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58dd4457b9-h4l2s"] Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.236617 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a41b464c-76bd-4527-82a2-69408a024f68-horizon-secret-key\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.302713 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.339552 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.354383 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wx8lz" podStartSLOduration=4.354355495 podStartE2EDuration="4.354355495s" podCreationTimestamp="2025-11-22 09:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:31:19.092087422 +0000 UTC m=+1054.027777101" watchObservedRunningTime="2025-11-22 09:31:19.354355495 +0000 UTC m=+1054.290045154" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.534273 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr6lz\" (UniqueName: \"kubernetes.io/projected/a41b464c-76bd-4527-82a2-69408a024f68-kube-api-access-qr6lz\") pod \"horizon-58dd4457b9-h4l2s\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.577578 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11494284-5130-4203-8185-91958a668040" (UID: "11494284-5130-4203-8185-91958a668040"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.620593 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11494284-5130-4203-8185-91958a668040-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.733234 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.750010 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.825168 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-config\") pod \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.825276 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-swift-storage-0\") pod \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.825341 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcwws\" (UniqueName: \"kubernetes.io/projected/87d707ea-60ff-4f96-acc8-b99fdd56cf03-kube-api-access-fcwws\") pod \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.825415 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-nb\") pod \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.825436 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-sb\") pod \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.825742 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-svc\") pod \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\" (UID: \"87d707ea-60ff-4f96-acc8-b99fdd56cf03\") " Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.854300 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87d707ea-60ff-4f96-acc8-b99fdd56cf03-kube-api-access-fcwws" (OuterVolumeSpecName: "kube-api-access-fcwws") pod "87d707ea-60ff-4f96-acc8-b99fdd56cf03" (UID: "87d707ea-60ff-4f96-acc8-b99fdd56cf03"). InnerVolumeSpecName "kube-api-access-fcwws". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.927751 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcwws\" (UniqueName: \"kubernetes.io/projected/87d707ea-60ff-4f96-acc8-b99fdd56cf03-kube-api-access-fcwws\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.944559 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87d707ea-60ff-4f96-acc8-b99fdd56cf03" (UID: "87d707ea-60ff-4f96-acc8-b99fdd56cf03"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.945137 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-config" (OuterVolumeSpecName: "config") pod "87d707ea-60ff-4f96-acc8-b99fdd56cf03" (UID: "87d707ea-60ff-4f96-acc8-b99fdd56cf03"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.946011 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "87d707ea-60ff-4f96-acc8-b99fdd56cf03" (UID: "87d707ea-60ff-4f96-acc8-b99fdd56cf03"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.949250 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87d707ea-60ff-4f96-acc8-b99fdd56cf03" (UID: "87d707ea-60ff-4f96-acc8-b99fdd56cf03"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:19 crc kubenswrapper[4846]: I1122 09:31:19.952248 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87d707ea-60ff-4f96-acc8-b99fdd56cf03" (UID: "87d707ea-60ff-4f96-acc8-b99fdd56cf03"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.028933 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.028978 4846 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.028998 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.029008 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.029022 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87d707ea-60ff-4f96-acc8-b99fdd56cf03-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.256435 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" event={"ID":"d9550585-5ddb-45d1-9471-884d030282fb","Type":"ContainerStarted","Data":"a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99"} Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.256965 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.265879 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"240e8184-1e6d-4b28-bd27-80bb8e200f3f","Type":"ContainerStarted","Data":"63c5c12d1a774b22a79999356df8ad8839fd11ade0d6a82ffd1cfbafab8c0d62"} Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.271717 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.271739 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-7tlsb" event={"ID":"87d707ea-60ff-4f96-acc8-b99fdd56cf03","Type":"ContainerDied","Data":"11fcb659ad3a881978e1fafd0587d6d2859ca722b6c7be837a2259c808038260"} Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.271803 4846 scope.go:117] "RemoveContainer" containerID="85d0d3a45199908a0ef519bcb0f47bc214d0d70e1b47cd3e880df21c18bda6b4" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.285836 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" event={"ID":"11494284-5130-4203-8185-91958a668040","Type":"ContainerDied","Data":"4cec489f4bf2119e2ea3cffaaae0b207d787782d081799d0c5bf3f5145149d01"} Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.286544 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-gggcj" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.297113 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" podStartSLOduration=4.297065204 podStartE2EDuration="4.297065204s" podCreationTimestamp="2025-11-22 09:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:31:20.281004393 +0000 UTC m=+1055.216694042" watchObservedRunningTime="2025-11-22 09:31:20.297065204 +0000 UTC m=+1055.232754873" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.318465 4846 scope.go:117] "RemoveContainer" containerID="373fe77124db74b63854d6baa02e4f8690f2baae857975c03503918f4d0899fa" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.360184 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7tlsb"] Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.368114 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-7tlsb"] Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.374910 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gggcj"] Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.381123 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gggcj"] Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.400961 4846 scope.go:117] "RemoveContainer" containerID="4788be4bd23f1a2c1c823f2a3357dc694cc33fc06e02a41e80d0c016477bf2c0" Nov 22 09:31:20 crc kubenswrapper[4846]: I1122 09:31:20.612788 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58dd4457b9-h4l2s"] Nov 22 09:31:21 crc kubenswrapper[4846]: I1122 09:31:21.306987 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58dd4457b9-h4l2s" 
event={"ID":"a41b464c-76bd-4527-82a2-69408a024f68","Type":"ContainerStarted","Data":"aae8805d76321b087277f8211b4f7e995c87db67d07b73f68d98f3fec505bc2b"} Nov 22 09:31:21 crc kubenswrapper[4846]: I1122 09:31:21.323965 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a73a3301-2360-46f7-ae88-9dd9c93f0fb7","Type":"ContainerStarted","Data":"926321a8d9f82c5a8e6a20ca747338af5847a81a165c77d1c2fe39ecfb14d552"} Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.050928 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11494284-5130-4203-8185-91958a668040" path="/var/lib/kubelet/pods/11494284-5130-4203-8185-91958a668040/volumes" Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.052747 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d707ea-60ff-4f96-acc8-b99fdd56cf03" path="/var/lib/kubelet/pods/87d707ea-60ff-4f96-acc8-b99fdd56cf03/volumes" Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.342624 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"240e8184-1e6d-4b28-bd27-80bb8e200f3f","Type":"ContainerStarted","Data":"665df5d14154c563001d5a6218151aab193caee6a4e0dd2fe8318c4e9678fb00"} Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.343287 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerName="glance-log" containerID="cri-o://63c5c12d1a774b22a79999356df8ad8839fd11ade0d6a82ffd1cfbafab8c0d62" gracePeriod=30 Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.344093 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerName="glance-httpd" containerID="cri-o://665df5d14154c563001d5a6218151aab193caee6a4e0dd2fe8318c4e9678fb00" gracePeriod=30 Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.351482 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a73a3301-2360-46f7-ae88-9dd9c93f0fb7","Type":"ContainerStarted","Data":"c880279f4cb9bfa0f077b7d362584dd8ee5b7569978ec9cfc14ae595475448ed"} Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.351751 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerName="glance-log" containerID="cri-o://926321a8d9f82c5a8e6a20ca747338af5847a81a165c77d1c2fe39ecfb14d552" gracePeriod=30 Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.351923 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerName="glance-httpd" containerID="cri-o://c880279f4cb9bfa0f077b7d362584dd8ee5b7569978ec9cfc14ae595475448ed" gracePeriod=30 Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.373774 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.373742073 podStartE2EDuration="6.373742073s" podCreationTimestamp="2025-11-22 09:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:31:22.369834339 +0000 UTC m=+1057.305523998" 
watchObservedRunningTime="2025-11-22 09:31:22.373742073 +0000 UTC m=+1057.309431722" Nov 22 09:31:22 crc kubenswrapper[4846]: I1122 09:31:22.410931 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.410906512 podStartE2EDuration="6.410906512s" podCreationTimestamp="2025-11-22 09:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:31:22.403686491 +0000 UTC m=+1057.339376140" watchObservedRunningTime="2025-11-22 09:31:22.410906512 +0000 UTC m=+1057.346596161" Nov 22 09:31:23 crc kubenswrapper[4846]: I1122 09:31:23.366106 4846 generic.go:334] "Generic (PLEG): container finished" podID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerID="926321a8d9f82c5a8e6a20ca747338af5847a81a165c77d1c2fe39ecfb14d552" exitCode=143 Nov 22 09:31:23 crc kubenswrapper[4846]: I1122 09:31:23.366186 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a73a3301-2360-46f7-ae88-9dd9c93f0fb7","Type":"ContainerDied","Data":"926321a8d9f82c5a8e6a20ca747338af5847a81a165c77d1c2fe39ecfb14d552"} Nov 22 09:31:23 crc kubenswrapper[4846]: I1122 09:31:23.370607 4846 generic.go:334] "Generic (PLEG): container finished" podID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerID="665df5d14154c563001d5a6218151aab193caee6a4e0dd2fe8318c4e9678fb00" exitCode=0 Nov 22 09:31:23 crc kubenswrapper[4846]: I1122 09:31:23.370723 4846 generic.go:334] "Generic (PLEG): container finished" podID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerID="63c5c12d1a774b22a79999356df8ad8839fd11ade0d6a82ffd1cfbafab8c0d62" exitCode=143 Nov 22 09:31:23 crc kubenswrapper[4846]: I1122 09:31:23.370845 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"240e8184-1e6d-4b28-bd27-80bb8e200f3f","Type":"ContainerDied","Data":"665df5d14154c563001d5a6218151aab193caee6a4e0dd2fe8318c4e9678fb00"} Nov 22 09:31:23 crc kubenswrapper[4846]: I1122 09:31:23.370870 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"240e8184-1e6d-4b28-bd27-80bb8e200f3f","Type":"ContainerDied","Data":"63c5c12d1a774b22a79999356df8ad8839fd11ade0d6a82ffd1cfbafab8c0d62"} Nov 22 09:31:24 crc kubenswrapper[4846]: I1122 09:31:24.386692 4846 generic.go:334] "Generic (PLEG): container finished" podID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerID="c880279f4cb9bfa0f077b7d362584dd8ee5b7569978ec9cfc14ae595475448ed" exitCode=0 Nov 22 09:31:24 crc kubenswrapper[4846]: I1122 09:31:24.386765 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a73a3301-2360-46f7-ae88-9dd9c93f0fb7","Type":"ContainerDied","Data":"c880279f4cb9bfa0f077b7d362584dd8ee5b7569978ec9cfc14ae595475448ed"} Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.452259 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f84b6b857-6zgx9"] Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.496005 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8564f79874-c88vw"] Nov 22 09:31:25 crc kubenswrapper[4846]: E1122 09:31:25.497049 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d707ea-60ff-4f96-acc8-b99fdd56cf03" containerName="init" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.497089 4846 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="87d707ea-60ff-4f96-acc8-b99fdd56cf03" containerName="init" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.497317 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="87d707ea-60ff-4f96-acc8-b99fdd56cf03" containerName="init" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.503670 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.507299 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.528514 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8564f79874-c88vw"] Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.623649 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-config-data\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.623732 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-tls-certs\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.623783 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-secret-key\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.623835 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-scripts\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.623915 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f79042af-3413-4614-a787-72fdd7fc91d7-logs\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.623950 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-combined-ca-bundle\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.623973 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scfhz\" (UniqueName: \"kubernetes.io/projected/f79042af-3413-4614-a787-72fdd7fc91d7-kube-api-access-scfhz\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " 
pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.679109 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58dd4457b9-h4l2s"] Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.688067 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5dfd5ccb4b-fpl7v"] Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.690115 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.696322 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dfd5ccb4b-fpl7v"] Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.726582 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-combined-ca-bundle\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.726984 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scfhz\" (UniqueName: \"kubernetes.io/projected/f79042af-3413-4614-a787-72fdd7fc91d7-kube-api-access-scfhz\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.727202 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-config-data\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.727304 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-tls-certs\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.727395 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-secret-key\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.727500 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-scripts\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.727623 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f79042af-3413-4614-a787-72fdd7fc91d7-logs\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.728276 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f79042af-3413-4614-a787-72fdd7fc91d7-logs\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.730438 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-scripts\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.730610 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-config-data\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.736234 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-tls-certs\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.736394 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-combined-ca-bundle\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.739772 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-secret-key\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.748639 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scfhz\" (UniqueName: \"kubernetes.io/projected/f79042af-3413-4614-a787-72fdd7fc91d7-kube-api-access-scfhz\") pod \"horizon-8564f79874-c88vw\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") " pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.831454 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-config-data\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.831543 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-scripts\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.831568 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbz8f\" (UniqueName: \"kubernetes.io/projected/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-kube-api-access-tbz8f\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: 
\"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.831608 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-logs\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.831982 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-horizon-tls-certs\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.832193 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-horizon-secret-key\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.832227 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-combined-ca-bundle\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.897896 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.934724 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-horizon-tls-certs\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.934807 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-horizon-secret-key\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.934831 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-combined-ca-bundle\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.934917 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-config-data\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.934967 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-scripts\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.934993 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbz8f\" (UniqueName: \"kubernetes.io/projected/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-kube-api-access-tbz8f\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.935031 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-logs\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.936326 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-scripts\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.936726 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-logs\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.941773 4846 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-config-data\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.941967 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-horizon-secret-key\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.943194 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-horizon-tls-certs\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.945845 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-combined-ca-bundle\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:25 crc kubenswrapper[4846]: I1122 09:31:25.958140 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbz8f\" (UniqueName: \"kubernetes.io/projected/76c862f1-2cb3-4598-9be8-f8ff8bbab6f3-kube-api-access-tbz8f\") pod \"horizon-5dfd5ccb4b-fpl7v\" (UID: \"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3\") " pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:26 crc kubenswrapper[4846]: I1122 09:31:26.023900 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:31:26 crc kubenswrapper[4846]: I1122 09:31:26.796326 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:31:26 crc kubenswrapper[4846]: I1122 09:31:26.878000 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jfr65"] Nov 22 09:31:26 crc kubenswrapper[4846]: I1122 09:31:26.878349 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-jfr65" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" containerID="cri-o://e627d67b5ffbb2d432827153013d233f79bc6e2f4caddb0d54aeeaea2e1bc06d" gracePeriod=10 Nov 22 09:31:31 crc kubenswrapper[4846]: I1122 09:31:31.469351 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jfr65" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Nov 22 09:31:31 crc kubenswrapper[4846]: I1122 09:31:31.482403 4846 generic.go:334] "Generic (PLEG): container finished" podID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerID="e627d67b5ffbb2d432827153013d233f79bc6e2f4caddb0d54aeeaea2e1bc06d" exitCode=0 Nov 22 09:31:31 crc kubenswrapper[4846]: I1122 09:31:31.482467 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jfr65" event={"ID":"35af4ff6-1042-4f6d-93da-bfb4f43fd04d","Type":"ContainerDied","Data":"e627d67b5ffbb2d432827153013d233f79bc6e2f4caddb0d54aeeaea2e1bc06d"} Nov 22 09:31:33 crc kubenswrapper[4846]: E1122 09:31:33.179749 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Nov 22 09:31:33 crc kubenswrapper[4846]: E1122 09:31:33.180791 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjstk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-xh58j_openstack(49c8a1c0-2155-4d68-971a-e68aff9e5133): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:31:33 crc kubenswrapper[4846]: E1122 09:31:33.182131 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-xh58j" podUID="49c8a1c0-2155-4d68-971a-e68aff9e5133" Nov 22 09:31:33 crc kubenswrapper[4846]: E1122 09:31:33.519989 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-xh58j" podUID="49c8a1c0-2155-4d68-971a-e68aff9e5133" Nov 22 09:31:34 crc kubenswrapper[4846]: I1122 09:31:34.528599 4846 generic.go:334] "Generic (PLEG): container finished" podID="428b3f4b-ed9c-4e57-b374-65ec61230d39" containerID="bdb46927f8a6f6c0c5bef27f83b8cbbe2bf27a096578db3a382af5a48f60fd4a" exitCode=0 Nov 22 09:31:34 crc kubenswrapper[4846]: I1122 09:31:34.529007 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wx8lz" event={"ID":"428b3f4b-ed9c-4e57-b374-65ec61230d39","Type":"ContainerDied","Data":"bdb46927f8a6f6c0c5bef27f83b8cbbe2bf27a096578db3a382af5a48f60fd4a"} Nov 22 09:31:36 crc kubenswrapper[4846]: I1122 09:31:36.470723 4846 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jfr65" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Nov 22 09:31:41 crc kubenswrapper[4846]: I1122 09:31:41.471729 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jfr65" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Nov 22 09:31:41 crc kubenswrapper[4846]: I1122 09:31:41.472740 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jfr65" Nov 22 09:31:46 crc kubenswrapper[4846]: I1122 09:31:46.469945 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jfr65" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Nov 22 09:31:47 crc kubenswrapper[4846]: I1122 09:31:47.024824 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 09:31:47 crc kubenswrapper[4846]: I1122 09:31:47.024892 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 09:31:47 crc kubenswrapper[4846]: I1122 09:31:47.606819 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 09:31:47 crc kubenswrapper[4846]: I1122 09:31:47.606907 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 09:31:51 crc kubenswrapper[4846]: E1122 09:31:51.001806 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 22 09:31:51 crc kubenswrapper[4846]: E1122 09:31:51.003181 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbbh5c5h59fh57dh5dfhcch564h587h58h95h66h5d4h7fh9fh9dhc4h79h666h554hb9h64dh5dhd6h59fh549hb5hddh5f7h58bh9bh584h95q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rvztq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5f84b6b857-6zgx9_openstack(e3b50139-f7ff-477a-8f59-5b72e0413206): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:31:51 crc kubenswrapper[4846]: E1122 09:31:51.017524 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5f84b6b857-6zgx9" podUID="e3b50139-f7ff-477a-8f59-5b72e0413206" Nov 22 09:31:56 crc kubenswrapper[4846]: I1122 09:31:56.470359 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jfr65" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Nov 22 09:32:01 crc kubenswrapper[4846]: E1122 09:32:01.391431 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 22 09:32:01 crc kubenswrapper[4846]: E1122 09:32:01.392399 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b4h5c8h599h568h7bh58ch668h55fh55chddhdch65ch667h67dh78h64h698h597hdbh68dh55h8fhcbh66dh584h547h5dch646h587h8ch5f4h55dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qr6lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-58dd4457b9-h4l2s_openstack(a41b464c-76bd-4527-82a2-69408a024f68): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:32:01 crc kubenswrapper[4846]: E1122 09:32:01.393320 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 22 09:32:01 crc kubenswrapper[4846]: E1122 09:32:01.393535 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n667h58h86hd5h5fbh5b9h668h5c9h565h576h59h585h59dh66bh656h7dhfbh559h64bh677hcch77h5dbh55bh74h58dh568h97h654h5c9hbbh589q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnp7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7965d89547-s58k6_openstack(13948906-431d-4990-b5be-32a21b8113e9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:32:01 crc kubenswrapper[4846]: E1122 09:32:01.394594 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-58dd4457b9-h4l2s" podUID="a41b464c-76bd-4527-82a2-69408a024f68" Nov 22 09:32:01 crc kubenswrapper[4846]: E1122 09:32:01.395249 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7965d89547-s58k6" podUID="13948906-431d-4990-b5be-32a21b8113e9" Nov 22 09:32:01 crc kubenswrapper[4846]: I1122 09:32:01.472370 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jfr65" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Nov 22 09:32:01 crc kubenswrapper[4846]: E1122 09:32:01.723059 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Nov 22 09:32:01 crc kubenswrapper[4846]: E1122 09:32:01.723286 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n645h5dbh68bh4hbchf7h66dh58fh5c5h68dh5b5h64h4h5b9h665h56bh5bch67dh5d5hddh89h77h5bch58fh5cdh59fh657h65dhc6h688h554h66q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vlljj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(bff9dfbc-88dc-4ecc-95f3-4eac40350d97): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.205957 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.349138 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-scripts\") pod \"428b3f4b-ed9c-4e57-b374-65ec61230d39\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.349262 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-fernet-keys\") pod \"428b3f4b-ed9c-4e57-b374-65ec61230d39\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.349297 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-combined-ca-bundle\") pod \"428b3f4b-ed9c-4e57-b374-65ec61230d39\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.349411 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-config-data\") pod \"428b3f4b-ed9c-4e57-b374-65ec61230d39\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.349440 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m4l5\" (UniqueName: \"kubernetes.io/projected/428b3f4b-ed9c-4e57-b374-65ec61230d39-kube-api-access-2m4l5\") pod \"428b3f4b-ed9c-4e57-b374-65ec61230d39\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.349504 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-credential-keys\") pod \"428b3f4b-ed9c-4e57-b374-65ec61230d39\" (UID: \"428b3f4b-ed9c-4e57-b374-65ec61230d39\") " Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.355763 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/428b3f4b-ed9c-4e57-b374-65ec61230d39-kube-api-access-2m4l5" (OuterVolumeSpecName: "kube-api-access-2m4l5") pod "428b3f4b-ed9c-4e57-b374-65ec61230d39" (UID: "428b3f4b-ed9c-4e57-b374-65ec61230d39"). InnerVolumeSpecName "kube-api-access-2m4l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.355775 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-scripts" (OuterVolumeSpecName: "scripts") pod "428b3f4b-ed9c-4e57-b374-65ec61230d39" (UID: "428b3f4b-ed9c-4e57-b374-65ec61230d39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.360752 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "428b3f4b-ed9c-4e57-b374-65ec61230d39" (UID: "428b3f4b-ed9c-4e57-b374-65ec61230d39"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.376455 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "428b3f4b-ed9c-4e57-b374-65ec61230d39" (UID: "428b3f4b-ed9c-4e57-b374-65ec61230d39"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.379414 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "428b3f4b-ed9c-4e57-b374-65ec61230d39" (UID: "428b3f4b-ed9c-4e57-b374-65ec61230d39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.388243 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-config-data" (OuterVolumeSpecName: "config-data") pod "428b3f4b-ed9c-4e57-b374-65ec61230d39" (UID: "428b3f4b-ed9c-4e57-b374-65ec61230d39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.451699 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.451763 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m4l5\" (UniqueName: \"kubernetes.io/projected/428b3f4b-ed9c-4e57-b374-65ec61230d39-kube-api-access-2m4l5\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.451782 4846 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.451794 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.451804 4846 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.451815 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428b3f4b-ed9c-4e57-b374-65ec61230d39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.885931 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wx8lz" event={"ID":"428b3f4b-ed9c-4e57-b374-65ec61230d39","Type":"ContainerDied","Data":"a77fa3ccd8546ff6ea50bfc2ea6eaf03c6ea40aaf204287508fdb14c1f97d492"} Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.886495 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a77fa3ccd8546ff6ea50bfc2ea6eaf03c6ea40aaf204287508fdb14c1f97d492" Nov 22 09:32:02 crc kubenswrapper[4846]: I1122 09:32:02.886072 4846 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wx8lz" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.324567 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wx8lz"] Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.332037 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wx8lz"] Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.418142 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wjc4l"] Nov 22 09:32:03 crc kubenswrapper[4846]: E1122 09:32:03.418651 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="428b3f4b-ed9c-4e57-b374-65ec61230d39" containerName="keystone-bootstrap" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.418669 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="428b3f4b-ed9c-4e57-b374-65ec61230d39" containerName="keystone-bootstrap" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.418859 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="428b3f4b-ed9c-4e57-b374-65ec61230d39" containerName="keystone-bootstrap" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.419605 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.429898 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.430208 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.430414 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.431035 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d54gt" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.431219 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.432431 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wjc4l"] Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.487923 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-scripts\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.488271 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-credential-keys\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.488332 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-combined-ca-bundle\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc 
kubenswrapper[4846]: I1122 09:32:03.488371 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s84cw\" (UniqueName: \"kubernetes.io/projected/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-kube-api-access-s84cw\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.488463 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-config-data\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.488554 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-fernet-keys\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.590701 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-credential-keys\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.590841 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-combined-ca-bundle\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.590901 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s84cw\" (UniqueName: \"kubernetes.io/projected/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-kube-api-access-s84cw\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.591006 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-config-data\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.592153 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-fernet-keys\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.592261 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-scripts\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.597908 4846 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-fernet-keys\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.598221 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-combined-ca-bundle\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.598233 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-config-data\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.599089 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-credential-keys\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.599191 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-scripts\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.610573 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s84cw\" (UniqueName: \"kubernetes.io/projected/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-kube-api-access-s84cw\") pod \"keystone-bootstrap-wjc4l\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: E1122 09:32:03.698797 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 22 09:32:03 crc kubenswrapper[4846]: E1122 09:32:03.699346 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9hczs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xgp99_openstack(083da0b8-38d6-4eab-b211-8389df97a0a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 09:32:03 crc kubenswrapper[4846]: E1122 09:32:03.700689 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xgp99" podUID="083da0b8-38d6-4eab-b211-8389df97a0a8" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.741000 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.797114 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.804475 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jfr65" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.813092 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.816162 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899176 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-combined-ca-bundle\") pod \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899268 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-config\") pod \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899296 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvztq\" (UniqueName: \"kubernetes.io/projected/e3b50139-f7ff-477a-8f59-5b72e0413206-kube-api-access-rvztq\") pod \"e3b50139-f7ff-477a-8f59-5b72e0413206\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899327 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-sb\") pod \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899431 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-config-data\") pod \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899475 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-nb\") pod \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899515 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-config-data\") pod \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899543 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-logs\") pod \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899615 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-combined-ca-bundle\") pod \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899682 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b50139-f7ff-477a-8f59-5b72e0413206-horizon-secret-key\") pod \"e3b50139-f7ff-477a-8f59-5b72e0413206\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.899705 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b50139-f7ff-477a-8f59-5b72e0413206-logs\") pod \"e3b50139-f7ff-477a-8f59-5b72e0413206\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900123 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-logs\") pod \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900167 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-internal-tls-certs\") pod \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900282 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-config-data\") pod \"e3b50139-f7ff-477a-8f59-5b72e0413206\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900323 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45l6g\" (UniqueName: \"kubernetes.io/projected/240e8184-1e6d-4b28-bd27-80bb8e200f3f-kube-api-access-45l6g\") pod \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900349 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-dns-svc\") pod \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900379 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqtwv\" (UniqueName: \"kubernetes.io/projected/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-kube-api-access-mqtwv\") pod \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\" (UID: \"35af4ff6-1042-4f6d-93da-bfb4f43fd04d\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900439 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-public-tls-certs\") pod \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900457 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900476 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-scripts\") pod \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900503 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmmnr\" (UniqueName: \"kubernetes.io/projected/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-kube-api-access-vmmnr\") pod \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900552 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-httpd-run\") pod \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900575 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-scripts\") pod \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\" (UID: \"240e8184-1e6d-4b28-bd27-80bb8e200f3f\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900618 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-httpd-run\") pod \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900639 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\" (UID: \"a73a3301-2360-46f7-ae88-9dd9c93f0fb7\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.900659 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-scripts\") pod \"e3b50139-f7ff-477a-8f59-5b72e0413206\" (UID: \"e3b50139-f7ff-477a-8f59-5b72e0413206\") " Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.901866 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "240e8184-1e6d-4b28-bd27-80bb8e200f3f" (UID: "240e8184-1e6d-4b28-bd27-80bb8e200f3f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.901897 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-scripts" (OuterVolumeSpecName: "scripts") pod "e3b50139-f7ff-477a-8f59-5b72e0413206" (UID: "e3b50139-f7ff-477a-8f59-5b72e0413206"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.902271 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-config-data" (OuterVolumeSpecName: "config-data") pod "e3b50139-f7ff-477a-8f59-5b72e0413206" (UID: "e3b50139-f7ff-477a-8f59-5b72e0413206"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.903364 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b50139-f7ff-477a-8f59-5b72e0413206-logs" (OuterVolumeSpecName: "logs") pod "e3b50139-f7ff-477a-8f59-5b72e0413206" (UID: "e3b50139-f7ff-477a-8f59-5b72e0413206"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.903516 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-logs" (OuterVolumeSpecName: "logs") pod "a73a3301-2360-46f7-ae88-9dd9c93f0fb7" (UID: "a73a3301-2360-46f7-ae88-9dd9c93f0fb7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.904183 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-logs" (OuterVolumeSpecName: "logs") pod "240e8184-1e6d-4b28-bd27-80bb8e200f3f" (UID: "240e8184-1e6d-4b28-bd27-80bb8e200f3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.913968 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a73a3301-2360-46f7-ae88-9dd9c93f0fb7" (UID: "a73a3301-2360-46f7-ae88-9dd9c93f0fb7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.916645 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b50139-f7ff-477a-8f59-5b72e0413206-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e3b50139-f7ff-477a-8f59-5b72e0413206" (UID: "e3b50139-f7ff-477a-8f59-5b72e0413206"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.916906 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-kube-api-access-vmmnr" (OuterVolumeSpecName: "kube-api-access-vmmnr") pod "a73a3301-2360-46f7-ae88-9dd9c93f0fb7" (UID: "a73a3301-2360-46f7-ae88-9dd9c93f0fb7"). InnerVolumeSpecName "kube-api-access-vmmnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.923552 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-scripts" (OuterVolumeSpecName: "scripts") pod "a73a3301-2360-46f7-ae88-9dd9c93f0fb7" (UID: "a73a3301-2360-46f7-ae88-9dd9c93f0fb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.923568 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-kube-api-access-mqtwv" (OuterVolumeSpecName: "kube-api-access-mqtwv") pod "35af4ff6-1042-4f6d-93da-bfb4f43fd04d" (UID: "35af4ff6-1042-4f6d-93da-bfb4f43fd04d"). InnerVolumeSpecName "kube-api-access-mqtwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.925660 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b50139-f7ff-477a-8f59-5b72e0413206-kube-api-access-rvztq" (OuterVolumeSpecName: "kube-api-access-rvztq") pod "e3b50139-f7ff-477a-8f59-5b72e0413206" (UID: "e3b50139-f7ff-477a-8f59-5b72e0413206"). InnerVolumeSpecName "kube-api-access-rvztq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.925700 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"240e8184-1e6d-4b28-bd27-80bb8e200f3f","Type":"ContainerDied","Data":"8088ba84eae825179e4280c71aba2abf277aaa251baef183a9dbf4e12ea4e64d"} Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.925984 4846 scope.go:117] "RemoveContainer" containerID="665df5d14154c563001d5a6218151aab193caee6a4e0dd2fe8318c4e9678fb00" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.925794 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.931549 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f84b6b857-6zgx9" event={"ID":"e3b50139-f7ff-477a-8f59-5b72e0413206","Type":"ContainerDied","Data":"e7e0678a96ff22e505933794b540750776cee87fd6c43fc65e541fd0a3eb8982"} Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.931825 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f84b6b857-6zgx9" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.938012 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240e8184-1e6d-4b28-bd27-80bb8e200f3f-kube-api-access-45l6g" (OuterVolumeSpecName: "kube-api-access-45l6g") pod "240e8184-1e6d-4b28-bd27-80bb8e200f3f" (UID: "240e8184-1e6d-4b28-bd27-80bb8e200f3f"). InnerVolumeSpecName "kube-api-access-45l6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.939962 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-scripts" (OuterVolumeSpecName: "scripts") pod "240e8184-1e6d-4b28-bd27-80bb8e200f3f" (UID: "240e8184-1e6d-4b28-bd27-80bb8e200f3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.940883 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a73a3301-2360-46f7-ae88-9dd9c93f0fb7" (UID: "a73a3301-2360-46f7-ae88-9dd9c93f0fb7"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.941848 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jfr65" event={"ID":"35af4ff6-1042-4f6d-93da-bfb4f43fd04d","Type":"ContainerDied","Data":"e611e28932f348fc935c333fb1c09099ebe592f6773d0ecc51ba462b8889c067"} Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.941950 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jfr65" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.942532 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "240e8184-1e6d-4b28-bd27-80bb8e200f3f" (UID: "240e8184-1e6d-4b28-bd27-80bb8e200f3f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.952143 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a73a3301-2360-46f7-ae88-9dd9c93f0fb7","Type":"ContainerDied","Data":"9d8f872c50b3cb9fc9b7f17c045000b84647c4c705cfc9aab91e5f8ba797adc3"} Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.952192 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:03 crc kubenswrapper[4846]: E1122 09:32:03.954732 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-xgp99" podUID="083da0b8-38d6-4eab-b211-8389df97a0a8" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.964500 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "240e8184-1e6d-4b28-bd27-80bb8e200f3f" (UID: "240e8184-1e6d-4b28-bd27-80bb8e200f3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.969954 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a73a3301-2360-46f7-ae88-9dd9c93f0fb7" (UID: "a73a3301-2360-46f7-ae88-9dd9c93f0fb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.993489 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-config-data" (OuterVolumeSpecName: "config-data") pod "240e8184-1e6d-4b28-bd27-80bb8e200f3f" (UID: "240e8184-1e6d-4b28-bd27-80bb8e200f3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.994848 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35af4ff6-1042-4f6d-93da-bfb4f43fd04d" (UID: "35af4ff6-1042-4f6d-93da-bfb4f43fd04d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.997101 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-config" (OuterVolumeSpecName: "config") pod "35af4ff6-1042-4f6d-93da-bfb4f43fd04d" (UID: "35af4ff6-1042-4f6d-93da-bfb4f43fd04d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:03 crc kubenswrapper[4846]: I1122 09:32:03.998298 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "35af4ff6-1042-4f6d-93da-bfb4f43fd04d" (UID: "35af4ff6-1042-4f6d-93da-bfb4f43fd04d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004607 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004662 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004678 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004697 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004711 4846 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b50139-f7ff-477a-8f59-5b72e0413206-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004722 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b50139-f7ff-477a-8f59-5b72e0413206-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004734 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004749 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004765 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45l6g\" (UniqueName: \"kubernetes.io/projected/240e8184-1e6d-4b28-bd27-80bb8e200f3f-kube-api-access-45l6g\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004781 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004793 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqtwv\" (UniqueName: \"kubernetes.io/projected/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-kube-api-access-mqtwv\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004819 4846 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004832 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004844 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmmnr\" (UniqueName: \"kubernetes.io/projected/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-kube-api-access-vmmnr\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004857 4846 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/240e8184-1e6d-4b28-bd27-80bb8e200f3f-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004868 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004917 4846 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004942 4846 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004956 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b50139-f7ff-477a-8f59-5b72e0413206-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004968 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004980 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.004993 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvztq\" (UniqueName: \"kubernetes.io/projected/e3b50139-f7ff-477a-8f59-5b72e0413206-kube-api-access-rvztq\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.013158 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-config-data" (OuterVolumeSpecName: "config-data") pod "a73a3301-2360-46f7-ae88-9dd9c93f0fb7" (UID: "a73a3301-2360-46f7-ae88-9dd9c93f0fb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.016237 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "240e8184-1e6d-4b28-bd27-80bb8e200f3f" (UID: "240e8184-1e6d-4b28-bd27-80bb8e200f3f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.019362 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a73a3301-2360-46f7-ae88-9dd9c93f0fb7" (UID: "a73a3301-2360-46f7-ae88-9dd9c93f0fb7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.019740 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "35af4ff6-1042-4f6d-93da-bfb4f43fd04d" (UID: "35af4ff6-1042-4f6d-93da-bfb4f43fd04d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.034819 4846 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.035890 4846 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.047666 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="428b3f4b-ed9c-4e57-b374-65ec61230d39" path="/var/lib/kubelet/pods/428b3f4b-ed9c-4e57-b374-65ec61230d39/volumes" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.107637 4846 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/240e8184-1e6d-4b28-bd27-80bb8e200f3f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.107671 4846 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.107686 4846 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.107697 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35af4ff6-1042-4f6d-93da-bfb4f43fd04d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.107707 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.107718 4846 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a73a3301-2360-46f7-ae88-9dd9c93f0fb7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.114228 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f84b6b857-6zgx9"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.125174 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5f84b6b857-6zgx9"] Nov 22 09:32:04 crc 
kubenswrapper[4846]: I1122 09:32:04.257292 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.272848 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.297531 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jfr65"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.306576 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jfr65"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.322813 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.323480 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerName="glance-httpd" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323506 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerName="glance-httpd" Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.323549 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerName="glance-httpd" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323558 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerName="glance-httpd" Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.323587 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="init" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323595 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="init" Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.323611 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerName="glance-log" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323619 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerName="glance-log" Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.323633 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerName="glance-log" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323641 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerName="glance-log" Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.323665 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323674 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323925 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323945 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerName="glance-httpd" Nov 22 09:32:04 crc 
kubenswrapper[4846]: I1122 09:32:04.323966 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerName="glance-httpd" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323980 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" containerName="glance-log" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.323993 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" containerName="glance-log" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.325362 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.330721 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.338086 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.349327 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.349518 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.351308 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.353001 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.353310 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.353540 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.353678 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vfpx8" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.359097 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.371401 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.374748 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519004 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519126 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519150 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519186 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519214 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr5bd\" (UniqueName: \"kubernetes.io/projected/faac3725-9476-4d5c-b3a2-f927e4fe7af1-kube-api-access-gr5bd\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519236 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-config-data\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519334 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519455 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-logs\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519509 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fw9w\" (UniqueName: \"kubernetes.io/projected/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-kube-api-access-2fw9w\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519536 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519587 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519620 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519770 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-scripts\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519884 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.519987 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-logs\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.520035 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622174 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fw9w\" (UniqueName: \"kubernetes.io/projected/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-kube-api-access-2fw9w\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622250 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622286 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622320 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622359 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-scripts\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622393 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622433 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-logs\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622463 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622542 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622569 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622596 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622667 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622697 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr5bd\" (UniqueName: \"kubernetes.io/projected/faac3725-9476-4d5c-b3a2-f927e4fe7af1-kube-api-access-gr5bd\") 
pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622730 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-config-data\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622753 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622765 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.623102 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.623142 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.623458 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-logs\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.625160 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.630328 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-scripts\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.631204 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-config-data\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 
09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.622794 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-logs\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.631683 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-logs\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.633505 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.635917 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.638397 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.640530 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.640808 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.644305 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.645680 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fw9w\" (UniqueName: \"kubernetes.io/projected/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-kube-api-access-2fw9w\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.647952 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gr5bd\" (UniqueName: \"kubernetes.io/projected/faac3725-9476-4d5c-b3a2-f927e4fe7af1-kube-api-access-gr5bd\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.663584 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.670973 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") " pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.682918 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.694711 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.782092 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.782307 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmgb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-mtz29_openstack(bb094f7c-1527-476d-bf4a-d54a022320d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.783484 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mtz29" podUID="bb094f7c-1527-476d-bf4a-d54a022320d0" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.872389 4846 scope.go:117] "RemoveContainer" containerID="63c5c12d1a774b22a79999356df8ad8839fd11ade0d6a82ffd1cfbafab8c0d62" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.878917 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.904030 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.974981 4846 scope.go:117] "RemoveContainer" containerID="e627d67b5ffbb2d432827153013d233f79bc6e2f4caddb0d54aeeaea2e1bc06d" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.986602 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58dd4457b9-h4l2s" event={"ID":"a41b464c-76bd-4527-82a2-69408a024f68","Type":"ContainerDied","Data":"aae8805d76321b087277f8211b4f7e995c87db67d07b73f68d98f3fec505bc2b"} Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.986662 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58dd4457b9-h4l2s" Nov 22 09:32:04 crc kubenswrapper[4846]: E1122 09:32:04.989276 4846 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/dnsmasq-dns-698758b865-jfr65_openstack_dnsmasq-dns-e627d67b5ffbb2d432827153013d233f79bc6e2f4caddb0d54aeeaea2e1bc06d.log: no such file or directory" path="/var/log/containers/dnsmasq-dns-698758b865-jfr65_openstack_dnsmasq-dns-e627d67b5ffbb2d432827153013d233f79bc6e2f4caddb0d54aeeaea2e1bc06d.log" Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.991349 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7965d89547-s58k6" event={"ID":"13948906-431d-4990-b5be-32a21b8113e9","Type":"ContainerDied","Data":"37c3a8e016eeb97594fde6775f1c2c8bed344a4da524dfbfc3c8fd4017521b28"} Nov 22 09:32:04 crc kubenswrapper[4846]: I1122 09:32:04.991425 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7965d89547-s58k6" Nov 22 09:32:05 crc kubenswrapper[4846]: E1122 09:32:05.010589 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-mtz29" podUID="bb094f7c-1527-476d-bf4a-d54a022320d0" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039437 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-config-data\") pod \"a41b464c-76bd-4527-82a2-69408a024f68\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039485 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a41b464c-76bd-4527-82a2-69408a024f68-horizon-secret-key\") pod \"a41b464c-76bd-4527-82a2-69408a024f68\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039529 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13948906-431d-4990-b5be-32a21b8113e9-logs\") pod \"13948906-431d-4990-b5be-32a21b8113e9\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039606 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a41b464c-76bd-4527-82a2-69408a024f68-logs\") pod \"a41b464c-76bd-4527-82a2-69408a024f68\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039674 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-config-data\") pod \"13948906-431d-4990-b5be-32a21b8113e9\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039765 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-scripts\") pod \"a41b464c-76bd-4527-82a2-69408a024f68\" (UID: \"a41b464c-76bd-4527-82a2-69408a024f68\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039798 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/13948906-431d-4990-b5be-32a21b8113e9-horizon-secret-key\") pod \"13948906-431d-4990-b5be-32a21b8113e9\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039858 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-scripts\") pod \"13948906-431d-4990-b5be-32a21b8113e9\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039923 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr6lz\" (UniqueName: \"kubernetes.io/projected/a41b464c-76bd-4527-82a2-69408a024f68-kube-api-access-qr6lz\") pod \"a41b464c-76bd-4527-82a2-69408a024f68\" (UID: 
\"a41b464c-76bd-4527-82a2-69408a024f68\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.039969 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnp7w\" (UniqueName: \"kubernetes.io/projected/13948906-431d-4990-b5be-32a21b8113e9-kube-api-access-jnp7w\") pod \"13948906-431d-4990-b5be-32a21b8113e9\" (UID: \"13948906-431d-4990-b5be-32a21b8113e9\") " Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.040145 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13948906-431d-4990-b5be-32a21b8113e9-logs" (OuterVolumeSpecName: "logs") pod "13948906-431d-4990-b5be-32a21b8113e9" (UID: "13948906-431d-4990-b5be-32a21b8113e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.040397 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13948906-431d-4990-b5be-32a21b8113e9-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.040528 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a41b464c-76bd-4527-82a2-69408a024f68-logs" (OuterVolumeSpecName: "logs") pod "a41b464c-76bd-4527-82a2-69408a024f68" (UID: "a41b464c-76bd-4527-82a2-69408a024f68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.040694 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-config-data" (OuterVolumeSpecName: "config-data") pod "a41b464c-76bd-4527-82a2-69408a024f68" (UID: "a41b464c-76bd-4527-82a2-69408a024f68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.040696 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-config-data" (OuterVolumeSpecName: "config-data") pod "13948906-431d-4990-b5be-32a21b8113e9" (UID: "13948906-431d-4990-b5be-32a21b8113e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.040993 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-scripts" (OuterVolumeSpecName: "scripts") pod "a41b464c-76bd-4527-82a2-69408a024f68" (UID: "a41b464c-76bd-4527-82a2-69408a024f68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.041373 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-scripts" (OuterVolumeSpecName: "scripts") pod "13948906-431d-4990-b5be-32a21b8113e9" (UID: "13948906-431d-4990-b5be-32a21b8113e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.045216 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13948906-431d-4990-b5be-32a21b8113e9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "13948906-431d-4990-b5be-32a21b8113e9" (UID: "13948906-431d-4990-b5be-32a21b8113e9"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.045861 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41b464c-76bd-4527-82a2-69408a024f68-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a41b464c-76bd-4527-82a2-69408a024f68" (UID: "a41b464c-76bd-4527-82a2-69408a024f68"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.047249 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13948906-431d-4990-b5be-32a21b8113e9-kube-api-access-jnp7w" (OuterVolumeSpecName: "kube-api-access-jnp7w") pod "13948906-431d-4990-b5be-32a21b8113e9" (UID: "13948906-431d-4990-b5be-32a21b8113e9"). InnerVolumeSpecName "kube-api-access-jnp7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.051358 4846 scope.go:117] "RemoveContainer" containerID="3a254155271141f405240411c56b83b79ce7e9c56eba71fd3691f8b9caa7d78f" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.051451 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41b464c-76bd-4527-82a2-69408a024f68-kube-api-access-qr6lz" (OuterVolumeSpecName: "kube-api-access-qr6lz") pod "a41b464c-76bd-4527-82a2-69408a024f68" (UID: "a41b464c-76bd-4527-82a2-69408a024f68"). InnerVolumeSpecName "kube-api-access-qr6lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.093594 4846 scope.go:117] "RemoveContainer" containerID="c880279f4cb9bfa0f077b7d362584dd8ee5b7569978ec9cfc14ae595475448ed" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.138913 4846 scope.go:117] "RemoveContainer" containerID="926321a8d9f82c5a8e6a20ca747338af5847a81a165c77d1c2fe39ecfb14d552" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.142495 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.142536 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr6lz\" (UniqueName: \"kubernetes.io/projected/a41b464c-76bd-4527-82a2-69408a024f68-kube-api-access-qr6lz\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.142550 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnp7w\" (UniqueName: \"kubernetes.io/projected/13948906-431d-4990-b5be-32a21b8113e9-kube-api-access-jnp7w\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.142562 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.142576 4846 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a41b464c-76bd-4527-82a2-69408a024f68-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.142587 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a41b464c-76bd-4527-82a2-69408a024f68-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.142597 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13948906-431d-4990-b5be-32a21b8113e9-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.142607 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a41b464c-76bd-4527-82a2-69408a024f68-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.142620 4846 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/13948906-431d-4990-b5be-32a21b8113e9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.340900 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dfd5ccb4b-fpl7v"] Nov 22 09:32:05 crc kubenswrapper[4846]: W1122 09:32:05.375666 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf79042af_3413_4614_a787_72fdd7fc91d7.slice/crio-5c97cfbad8cd67e3b029562471229f3dbac1430883e1b3e233c33bc3ee6d9ab9 WatchSource:0}: Error finding container 5c97cfbad8cd67e3b029562471229f3dbac1430883e1b3e233c33bc3ee6d9ab9: Status 404 returned error can't find the container with id 5c97cfbad8cd67e3b029562471229f3dbac1430883e1b3e233c33bc3ee6d9ab9 Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.404404 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8564f79874-c88vw"] Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.413860 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7965d89547-s58k6"] Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.422102 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7965d89547-s58k6"] Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.449249 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58dd4457b9-h4l2s"] Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.463559 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58dd4457b9-h4l2s"] Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.618686 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wjc4l"] Nov 22 09:32:05 crc kubenswrapper[4846]: I1122 09:32:05.980285 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.000469 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wjc4l" event={"ID":"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d","Type":"ContainerStarted","Data":"ed3a9f30db2eafb70efa74708796a5be2d4b8ad98a9cadf4f36e780439416ac2"} Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.000516 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wjc4l" event={"ID":"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d","Type":"ContainerStarted","Data":"910eb6829fff3930a315eea6beeef2265c39253d136f8bcfa0cf80f389053d59"} Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.006567 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dfd5ccb4b-fpl7v" 
event={"ID":"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3","Type":"ContainerStarted","Data":"4fb2535be5468bb209fcf5d3798eb9642b46f3bdf69e5b46e1605d4d6d6aa54a"} Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.007849 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xh58j" event={"ID":"49c8a1c0-2155-4d68-971a-e68aff9e5133","Type":"ContainerStarted","Data":"0786d84e5c9d01386fe2c06bfab25d155d8a0d396f061bc48524ba5448031ebd"} Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.008902 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8564f79874-c88vw" event={"ID":"f79042af-3413-4614-a787-72fdd7fc91d7","Type":"ContainerStarted","Data":"5c97cfbad8cd67e3b029562471229f3dbac1430883e1b3e233c33bc3ee6d9ab9"} Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.032988 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wjc4l" podStartSLOduration=3.032969442 podStartE2EDuration="3.032969442s" podCreationTimestamp="2025-11-22 09:32:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:06.021818776 +0000 UTC m=+1100.957508425" watchObservedRunningTime="2025-11-22 09:32:06.032969442 +0000 UTC m=+1100.968659081" Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.049608 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xh58j" podStartSLOduration=2.646777077 podStartE2EDuration="50.049590529s" podCreationTimestamp="2025-11-22 09:31:16 +0000 UTC" firstStartedPulling="2025-11-22 09:31:17.457579887 +0000 UTC m=+1052.393269536" lastFinishedPulling="2025-11-22 09:32:04.860393339 +0000 UTC m=+1099.796082988" observedRunningTime="2025-11-22 09:32:06.048980751 +0000 UTC m=+1100.984670410" watchObservedRunningTime="2025-11-22 09:32:06.049590529 +0000 UTC m=+1100.985280178" Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.060746 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13948906-431d-4990-b5be-32a21b8113e9" path="/var/lib/kubelet/pods/13948906-431d-4990-b5be-32a21b8113e9/volumes" Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.061329 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="240e8184-1e6d-4b28-bd27-80bb8e200f3f" path="/var/lib/kubelet/pods/240e8184-1e6d-4b28-bd27-80bb8e200f3f/volumes" Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.062107 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" path="/var/lib/kubelet/pods/35af4ff6-1042-4f6d-93da-bfb4f43fd04d/volumes" Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.063633 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41b464c-76bd-4527-82a2-69408a024f68" path="/var/lib/kubelet/pods/a41b464c-76bd-4527-82a2-69408a024f68/volumes" Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.064171 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73a3301-2360-46f7-ae88-9dd9c93f0fb7" path="/var/lib/kubelet/pods/a73a3301-2360-46f7-ae88-9dd9c93f0fb7/volumes" Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.065074 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b50139-f7ff-477a-8f59-5b72e0413206" path="/var/lib/kubelet/pods/e3b50139-f7ff-477a-8f59-5b72e0413206/volumes" Nov 22 09:32:06 crc kubenswrapper[4846]: W1122 09:32:06.151130 4846 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbe5756d_eccc_4a1b_807b_7a5cd0962ea0.slice/crio-fa65ba9297be4f8c30299cfa79df1204052ace2022e619bdcbde6129a9372513 WatchSource:0}: Error finding container fa65ba9297be4f8c30299cfa79df1204052ace2022e619bdcbde6129a9372513: Status 404 returned error can't find the container with id fa65ba9297be4f8c30299cfa79df1204052ace2022e619bdcbde6129a9372513 Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.472637 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jfr65" podUID="35af4ff6-1042-4f6d-93da-bfb4f43fd04d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Nov 22 09:32:06 crc kubenswrapper[4846]: I1122 09:32:06.849363 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:32:06 crc kubenswrapper[4846]: W1122 09:32:06.856344 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaac3725_9476_4d5c_b3a2_f927e4fe7af1.slice/crio-731c36e606ec67f500855762b5a8744298f51eacec234e86e310165e3d73c31d WatchSource:0}: Error finding container 731c36e606ec67f500855762b5a8744298f51eacec234e86e310165e3d73c31d: Status 404 returned error can't find the container with id 731c36e606ec67f500855762b5a8744298f51eacec234e86e310165e3d73c31d Nov 22 09:32:07 crc kubenswrapper[4846]: I1122 09:32:07.026860 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faac3725-9476-4d5c-b3a2-f927e4fe7af1","Type":"ContainerStarted","Data":"731c36e606ec67f500855762b5a8744298f51eacec234e86e310165e3d73c31d"} Nov 22 09:32:07 crc kubenswrapper[4846]: I1122 09:32:07.031083 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bff9dfbc-88dc-4ecc-95f3-4eac40350d97","Type":"ContainerStarted","Data":"b38ada579bde7c762ec25c3cfe135c3b599a86375422c7b9dcf9c19adb47d72f"} Nov 22 09:32:07 crc kubenswrapper[4846]: I1122 09:32:07.033860 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dfd5ccb4b-fpl7v" event={"ID":"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3","Type":"ContainerStarted","Data":"5c9aaa8bbb5ce6f17c1bfb5513663dc4968b6425cde638ad10c81a34967a5c1d"} Nov 22 09:32:07 crc kubenswrapper[4846]: I1122 09:32:07.033933 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dfd5ccb4b-fpl7v" event={"ID":"76c862f1-2cb3-4598-9be8-f8ff8bbab6f3","Type":"ContainerStarted","Data":"c3b8a4854432a3d56dfe70e83fba86eb923ef6ab7c5a659c4835417e65c3b3c2"} Nov 22 09:32:07 crc kubenswrapper[4846]: I1122 09:32:07.045633 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8564f79874-c88vw" event={"ID":"f79042af-3413-4614-a787-72fdd7fc91d7","Type":"ContainerStarted","Data":"5a55e9de128b8e237f5cd8fabafd175065c25d630198e7d28d8c6d6779e35778"} Nov 22 09:32:07 crc kubenswrapper[4846]: I1122 09:32:07.045701 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8564f79874-c88vw" event={"ID":"f79042af-3413-4614-a787-72fdd7fc91d7","Type":"ContainerStarted","Data":"27001cb788644216b3d4184a41d7a2c4b77ab48f03fe634319d55ff847ebacea"} Nov 22 09:32:07 crc kubenswrapper[4846]: I1122 09:32:07.058280 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0","Type":"ContainerStarted","Data":"0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46"} Nov 22 09:32:07 crc kubenswrapper[4846]: I1122 09:32:07.058370 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0","Type":"ContainerStarted","Data":"fa65ba9297be4f8c30299cfa79df1204052ace2022e619bdcbde6129a9372513"} Nov 22 09:32:07 crc kubenswrapper[4846]: I1122 09:32:07.075195 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5dfd5ccb4b-fpl7v" podStartSLOduration=41.141472911 podStartE2EDuration="42.075163155s" podCreationTimestamp="2025-11-22 09:31:25 +0000 UTC" firstStartedPulling="2025-11-22 09:32:05.364861209 +0000 UTC m=+1100.300550858" lastFinishedPulling="2025-11-22 09:32:06.298551453 +0000 UTC m=+1101.234241102" observedRunningTime="2025-11-22 09:32:07.066263235 +0000 UTC m=+1102.001952894" watchObservedRunningTime="2025-11-22 09:32:07.075163155 +0000 UTC m=+1102.010852804" Nov 22 09:32:08 crc kubenswrapper[4846]: I1122 09:32:08.100609 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0","Type":"ContainerStarted","Data":"ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077"} Nov 22 09:32:08 crc kubenswrapper[4846]: I1122 09:32:08.114738 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faac3725-9476-4d5c-b3a2-f927e4fe7af1","Type":"ContainerStarted","Data":"994e2e776f1e64c9a74b2724543c9d1fd2373e411e5de36b390558e153cd751d"} Nov 22 09:32:08 crc kubenswrapper[4846]: I1122 09:32:08.129538 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8564f79874-c88vw" podStartSLOduration=42.214323591 podStartE2EDuration="43.129501213s" podCreationTimestamp="2025-11-22 09:31:25 +0000 UTC" firstStartedPulling="2025-11-22 09:32:05.379135627 +0000 UTC m=+1100.314825286" lastFinishedPulling="2025-11-22 09:32:06.294313259 +0000 UTC m=+1101.230002908" observedRunningTime="2025-11-22 09:32:07.101364513 +0000 UTC m=+1102.037054162" watchObservedRunningTime="2025-11-22 09:32:08.129501213 +0000 UTC m=+1103.065190862" Nov 22 09:32:08 crc kubenswrapper[4846]: I1122 09:32:08.134995 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.134963723 podStartE2EDuration="4.134963723s" podCreationTimestamp="2025-11-22 09:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:08.125366252 +0000 UTC m=+1103.061055901" watchObservedRunningTime="2025-11-22 09:32:08.134963723 +0000 UTC m=+1103.070653372" Nov 22 09:32:09 crc kubenswrapper[4846]: I1122 09:32:09.141679 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faac3725-9476-4d5c-b3a2-f927e4fe7af1","Type":"ContainerStarted","Data":"24840462d593fd5e9db8ae83104c7d8c71a5b51e0c96fa0c54e9d5765805d224"} Nov 22 09:32:09 crc kubenswrapper[4846]: I1122 09:32:09.189361 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.189327862 podStartE2EDuration="5.189327862s" podCreationTimestamp="2025-11-22 09:32:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:09.173497818 +0000 UTC m=+1104.109187467" watchObservedRunningTime="2025-11-22 09:32:09.189327862 +0000 UTC m=+1104.125017521" Nov 22 09:32:14 crc kubenswrapper[4846]: I1122 09:32:14.684115 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 09:32:14 crc kubenswrapper[4846]: I1122 09:32:14.685724 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 09:32:14 crc kubenswrapper[4846]: I1122 09:32:14.695003 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:14 crc kubenswrapper[4846]: I1122 09:32:14.695101 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:14 crc kubenswrapper[4846]: I1122 09:32:14.721623 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 09:32:14 crc kubenswrapper[4846]: I1122 09:32:14.736732 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:14 crc kubenswrapper[4846]: I1122 09:32:14.739529 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 09:32:14 crc kubenswrapper[4846]: I1122 09:32:14.764709 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:15 crc kubenswrapper[4846]: I1122 09:32:15.218787 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:15 crc kubenswrapper[4846]: I1122 09:32:15.219020 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 09:32:15 crc kubenswrapper[4846]: I1122 09:32:15.219040 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 09:32:15 crc kubenswrapper[4846]: I1122 09:32:15.219411 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:15 crc kubenswrapper[4846]: I1122 09:32:15.898739 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:32:15 crc kubenswrapper[4846]: I1122 09:32:15.899195 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:32:16 crc kubenswrapper[4846]: I1122 09:32:16.024698 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:32:16 crc kubenswrapper[4846]: I1122 09:32:16.024768 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:32:16 crc kubenswrapper[4846]: I1122 09:32:16.027223 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5dfd5ccb4b-fpl7v" podUID="76c862f1-2cb3-4598-9be8-f8ff8bbab6f3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Nov 
22 09:32:17 crc kubenswrapper[4846]: I1122 09:32:17.238198 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 09:32:17 crc kubenswrapper[4846]: I1122 09:32:17.238679 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 09:32:17 crc kubenswrapper[4846]: I1122 09:32:17.294580 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 09:32:17 crc kubenswrapper[4846]: I1122 09:32:17.531616 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 09:32:17 crc kubenswrapper[4846]: I1122 09:32:17.532680 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:17 crc kubenswrapper[4846]: I1122 09:32:17.532850 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 09:32:17 crc kubenswrapper[4846]: I1122 09:32:17.966465 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 09:32:22 crc kubenswrapper[4846]: I1122 09:32:22.300878 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mtz29" event={"ID":"bb094f7c-1527-476d-bf4a-d54a022320d0","Type":"ContainerStarted","Data":"5672aca4aadda66f02b97cdd31f0cb5bde14c5316521dcf7ace45e695c6e8ab7"} Nov 22 09:32:22 crc kubenswrapper[4846]: I1122 09:32:22.307030 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xgp99" event={"ID":"083da0b8-38d6-4eab-b211-8389df97a0a8","Type":"ContainerStarted","Data":"4f85422a8125cdaf852b99a918947ac6379240d75c79121f17a573a8bb7927ce"} Nov 22 09:32:22 crc kubenswrapper[4846]: I1122 09:32:22.310415 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bff9dfbc-88dc-4ecc-95f3-4eac40350d97","Type":"ContainerStarted","Data":"501a43069e6df8f9e9414e97cd51203879e497e2c20e22470f604dadaddf2ed6"} Nov 22 09:32:22 crc kubenswrapper[4846]: I1122 09:32:22.312941 4846 generic.go:334] "Generic (PLEG): container finished" podID="cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" containerID="ed3a9f30db2eafb70efa74708796a5be2d4b8ad98a9cadf4f36e780439416ac2" exitCode=0 Nov 22 09:32:22 crc kubenswrapper[4846]: I1122 09:32:22.313013 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wjc4l" event={"ID":"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d","Type":"ContainerDied","Data":"ed3a9f30db2eafb70efa74708796a5be2d4b8ad98a9cadf4f36e780439416ac2"} Nov 22 09:32:22 crc kubenswrapper[4846]: I1122 09:32:22.331292 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mtz29" podStartSLOduration=3.175912356 podStartE2EDuration="1m6.331269507s" podCreationTimestamp="2025-11-22 09:31:16 +0000 UTC" firstStartedPulling="2025-11-22 09:31:17.912804812 +0000 UTC m=+1052.848494461" lastFinishedPulling="2025-11-22 09:32:21.068161973 +0000 UTC m=+1116.003851612" observedRunningTime="2025-11-22 09:32:22.321090579 +0000 UTC m=+1117.256780228" watchObservedRunningTime="2025-11-22 09:32:22.331269507 +0000 UTC m=+1117.266959156" Nov 22 09:32:22 crc kubenswrapper[4846]: I1122 09:32:22.374312 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xgp99" podStartSLOduration=4.096650635 podStartE2EDuration="1m7.374288538s" podCreationTimestamp="2025-11-22 09:31:15 +0000 UTC" 
firstStartedPulling="2025-11-22 09:31:17.785235215 +0000 UTC m=+1052.720924864" lastFinishedPulling="2025-11-22 09:32:21.062873118 +0000 UTC m=+1115.998562767" observedRunningTime="2025-11-22 09:32:22.369123776 +0000 UTC m=+1117.304813445" watchObservedRunningTime="2025-11-22 09:32:22.374288538 +0000 UTC m=+1117.309978187" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.326345 4846 generic.go:334] "Generic (PLEG): container finished" podID="49c8a1c0-2155-4d68-971a-e68aff9e5133" containerID="0786d84e5c9d01386fe2c06bfab25d155d8a0d396f061bc48524ba5448031ebd" exitCode=0 Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.326378 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xh58j" event={"ID":"49c8a1c0-2155-4d68-971a-e68aff9e5133","Type":"ContainerDied","Data":"0786d84e5c9d01386fe2c06bfab25d155d8a0d396f061bc48524ba5448031ebd"} Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.646432 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.706564 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s84cw\" (UniqueName: \"kubernetes.io/projected/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-kube-api-access-s84cw\") pod \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.706618 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-combined-ca-bundle\") pod \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.706649 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-credential-keys\") pod \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.706685 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-config-data\") pod \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.706959 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-scripts\") pod \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.706990 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-fernet-keys\") pod \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\" (UID: \"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d\") " Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.715381 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-kube-api-access-s84cw" (OuterVolumeSpecName: "kube-api-access-s84cw") pod "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" (UID: "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d"). 
InnerVolumeSpecName "kube-api-access-s84cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.715452 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" (UID: "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.717468 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-scripts" (OuterVolumeSpecName: "scripts") pod "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" (UID: "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.727270 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" (UID: "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.757186 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-config-data" (OuterVolumeSpecName: "config-data") pod "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" (UID: "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.783617 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" (UID: "cc88a2ff-257e-4a2b-81b5-e35f78e77a1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.809555 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s84cw\" (UniqueName: \"kubernetes.io/projected/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-kube-api-access-s84cw\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.809589 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.809598 4846 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.809607 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.809615 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:23 crc kubenswrapper[4846]: I1122 09:32:23.809625 4846 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.346479 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wjc4l" event={"ID":"cc88a2ff-257e-4a2b-81b5-e35f78e77a1d","Type":"ContainerDied","Data":"910eb6829fff3930a315eea6beeef2265c39253d136f8bcfa0cf80f389053d59"} Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.346546 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="910eb6829fff3930a315eea6beeef2265c39253d136f8bcfa0cf80f389053d59" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.346515 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wjc4l" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.575638 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5ccd94b5cf-fd5rp"] Nov 22 09:32:24 crc kubenswrapper[4846]: E1122 09:32:24.577298 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" containerName="keystone-bootstrap" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.577317 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" containerName="keystone-bootstrap" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.578815 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" containerName="keystone-bootstrap" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.608012 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5ccd94b5cf-fd5rp"] Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.608174 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.612985 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.620884 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-d54gt" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.621178 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.621405 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.621599 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.629682 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.732084 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-config-data\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.732179 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-public-tls-certs\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.732281 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ssp7\" (UniqueName: \"kubernetes.io/projected/fe29ba72-dfe7-4536-bf56-c282d31d2acb-kube-api-access-2ssp7\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.732514 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-fernet-keys\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.732696 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-scripts\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.732747 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-combined-ca-bundle\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.732989 4846 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xh58j" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.733005 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-credential-keys\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.733216 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-internal-tls-certs\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.835073 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-scripts\") pod \"49c8a1c0-2155-4d68-971a-e68aff9e5133\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.835163 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c8a1c0-2155-4d68-971a-e68aff9e5133-logs\") pod \"49c8a1c0-2155-4d68-971a-e68aff9e5133\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.835280 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjstk\" (UniqueName: \"kubernetes.io/projected/49c8a1c0-2155-4d68-971a-e68aff9e5133-kube-api-access-xjstk\") pod \"49c8a1c0-2155-4d68-971a-e68aff9e5133\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.835415 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-config-data\") pod \"49c8a1c0-2155-4d68-971a-e68aff9e5133\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.835452 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-combined-ca-bundle\") pod \"49c8a1c0-2155-4d68-971a-e68aff9e5133\" (UID: \"49c8a1c0-2155-4d68-971a-e68aff9e5133\") " Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.835840 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ssp7\" (UniqueName: \"kubernetes.io/projected/fe29ba72-dfe7-4536-bf56-c282d31d2acb-kube-api-access-2ssp7\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.836206 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49c8a1c0-2155-4d68-971a-e68aff9e5133-logs" (OuterVolumeSpecName: "logs") pod "49c8a1c0-2155-4d68-971a-e68aff9e5133" (UID: "49c8a1c0-2155-4d68-971a-e68aff9e5133"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.836547 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-fernet-keys\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.836885 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-scripts\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.837121 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-combined-ca-bundle\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.837300 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-credential-keys\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.837402 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-internal-tls-certs\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.837640 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-config-data\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.837764 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-public-tls-certs\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.837951 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c8a1c0-2155-4d68-971a-e68aff9e5133-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.845800 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-scripts" (OuterVolumeSpecName: "scripts") pod "49c8a1c0-2155-4d68-971a-e68aff9e5133" (UID: "49c8a1c0-2155-4d68-971a-e68aff9e5133"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.845952 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c8a1c0-2155-4d68-971a-e68aff9e5133-kube-api-access-xjstk" (OuterVolumeSpecName: "kube-api-access-xjstk") pod "49c8a1c0-2155-4d68-971a-e68aff9e5133" (UID: "49c8a1c0-2155-4d68-971a-e68aff9e5133"). InnerVolumeSpecName "kube-api-access-xjstk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.848271 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-config-data\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.848483 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-public-tls-certs\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.848534 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-combined-ca-bundle\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.848824 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-credential-keys\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.850394 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-scripts\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.854769 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-fernet-keys\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.863352 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe29ba72-dfe7-4536-bf56-c282d31d2acb-internal-tls-certs\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.863959 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ssp7\" (UniqueName: \"kubernetes.io/projected/fe29ba72-dfe7-4536-bf56-c282d31d2acb-kube-api-access-2ssp7\") pod \"keystone-5ccd94b5cf-fd5rp\" (UID: \"fe29ba72-dfe7-4536-bf56-c282d31d2acb\") " pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 
09:32:24.872889 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-config-data" (OuterVolumeSpecName: "config-data") pod "49c8a1c0-2155-4d68-971a-e68aff9e5133" (UID: "49c8a1c0-2155-4d68-971a-e68aff9e5133"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.884166 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49c8a1c0-2155-4d68-971a-e68aff9e5133" (UID: "49c8a1c0-2155-4d68-971a-e68aff9e5133"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.944340 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.944382 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.944397 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8a1c0-2155-4d68-971a-e68aff9e5133-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:24 crc kubenswrapper[4846]: I1122 09:32:24.944409 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjstk\" (UniqueName: \"kubernetes.io/projected/49c8a1c0-2155-4d68-971a-e68aff9e5133-kube-api-access-xjstk\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.023576 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.357116 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xh58j" event={"ID":"49c8a1c0-2155-4d68-971a-e68aff9e5133","Type":"ContainerDied","Data":"72efe662024f5da8f71db0a78c50ddd0d820860a7dc56628e27bf29ef8953527"} Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.357531 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72efe662024f5da8f71db0a78c50ddd0d820860a7dc56628e27bf29ef8953527" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.357217 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xh58j" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.482039 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-569956d6b4-jtk8r"] Nov 22 09:32:25 crc kubenswrapper[4846]: E1122 09:32:25.482571 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c8a1c0-2155-4d68-971a-e68aff9e5133" containerName="placement-db-sync" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.482588 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c8a1c0-2155-4d68-971a-e68aff9e5133" containerName="placement-db-sync" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.482754 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c8a1c0-2155-4d68-971a-e68aff9e5133" containerName="placement-db-sync" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.483796 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.493339 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.493695 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.495005 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.495089 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.495266 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4l2sw" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.503210 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-569956d6b4-jtk8r"] Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.553291 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5ccd94b5cf-fd5rp"] Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.556654 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-config-data\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.556832 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-logs\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.556950 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-combined-ca-bundle\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.557069 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-scripts\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.557096 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-internal-tls-certs\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.557159 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-public-tls-certs\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.557292 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rrd\" (UniqueName: \"kubernetes.io/projected/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-kube-api-access-b7rrd\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.659619 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-public-tls-certs\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.659706 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7rrd\" (UniqueName: \"kubernetes.io/projected/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-kube-api-access-b7rrd\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.659797 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-config-data\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.659850 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-logs\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.659879 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-combined-ca-bundle\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.659916 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-scripts\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.659937 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-internal-tls-certs\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.666665 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-internal-tls-certs\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.667071 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-public-tls-certs\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.668033 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-logs\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.668641 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-config-data\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.668757 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-scripts\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.668766 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-combined-ca-bundle\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.679876 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7rrd\" (UniqueName: \"kubernetes.io/projected/2f8c4b78-83b6-4f98-a4e2-ef7f56043775-kube-api-access-b7rrd\") pod \"placement-569956d6b4-jtk8r\" (UID: \"2f8c4b78-83b6-4f98-a4e2-ef7f56043775\") " pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.802958 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:25 crc kubenswrapper[4846]: I1122 09:32:25.900019 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8564f79874-c88vw" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 22 09:32:26 crc kubenswrapper[4846]: I1122 09:32:26.024517 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5dfd5ccb4b-fpl7v" podUID="76c862f1-2cb3-4598-9be8-f8ff8bbab6f3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Nov 22 09:32:28 crc kubenswrapper[4846]: I1122 09:32:28.625942 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:32:28 crc kubenswrapper[4846]: I1122 09:32:28.626437 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:32:29 crc kubenswrapper[4846]: W1122 09:32:29.179534 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe29ba72_dfe7_4536_bf56_c282d31d2acb.slice/crio-75d56647af60284f0f618e738da4dfafe5cb5a35676bcd4d8b07b4e9fa36333e WatchSource:0}: Error finding container 75d56647af60284f0f618e738da4dfafe5cb5a35676bcd4d8b07b4e9fa36333e: Status 404 returned error can't find the container with id 75d56647af60284f0f618e738da4dfafe5cb5a35676bcd4d8b07b4e9fa36333e Nov 22 09:32:29 crc kubenswrapper[4846]: I1122 09:32:29.398892 4846 generic.go:334] "Generic (PLEG): container finished" podID="bb094f7c-1527-476d-bf4a-d54a022320d0" containerID="5672aca4aadda66f02b97cdd31f0cb5bde14c5316521dcf7ace45e695c6e8ab7" exitCode=0 Nov 22 09:32:29 crc kubenswrapper[4846]: I1122 09:32:29.398978 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mtz29" event={"ID":"bb094f7c-1527-476d-bf4a-d54a022320d0","Type":"ContainerDied","Data":"5672aca4aadda66f02b97cdd31f0cb5bde14c5316521dcf7ace45e695c6e8ab7"} Nov 22 09:32:29 crc kubenswrapper[4846]: I1122 09:32:29.403536 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5ccd94b5cf-fd5rp" event={"ID":"fe29ba72-dfe7-4536-bf56-c282d31d2acb","Type":"ContainerStarted","Data":"75d56647af60284f0f618e738da4dfafe5cb5a35676bcd4d8b07b4e9fa36333e"} Nov 22 09:32:30 crc kubenswrapper[4846]: I1122 09:32:30.859944 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mtz29" Nov 22 09:32:30 crc kubenswrapper[4846]: I1122 09:32:30.965778 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmgb5\" (UniqueName: \"kubernetes.io/projected/bb094f7c-1527-476d-bf4a-d54a022320d0-kube-api-access-jmgb5\") pod \"bb094f7c-1527-476d-bf4a-d54a022320d0\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " Nov 22 09:32:30 crc kubenswrapper[4846]: I1122 09:32:30.965931 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-combined-ca-bundle\") pod \"bb094f7c-1527-476d-bf4a-d54a022320d0\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " Nov 22 09:32:30 crc kubenswrapper[4846]: I1122 09:32:30.968607 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-db-sync-config-data\") pod \"bb094f7c-1527-476d-bf4a-d54a022320d0\" (UID: \"bb094f7c-1527-476d-bf4a-d54a022320d0\") " Nov 22 09:32:30 crc kubenswrapper[4846]: I1122 09:32:30.972901 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb094f7c-1527-476d-bf4a-d54a022320d0-kube-api-access-jmgb5" (OuterVolumeSpecName: "kube-api-access-jmgb5") pod "bb094f7c-1527-476d-bf4a-d54a022320d0" (UID: "bb094f7c-1527-476d-bf4a-d54a022320d0"). InnerVolumeSpecName "kube-api-access-jmgb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:30 crc kubenswrapper[4846]: I1122 09:32:30.973407 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bb094f7c-1527-476d-bf4a-d54a022320d0" (UID: "bb094f7c-1527-476d-bf4a-d54a022320d0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:30 crc kubenswrapper[4846]: E1122 09:32:30.998635 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.002079 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb094f7c-1527-476d-bf4a-d54a022320d0" (UID: "bb094f7c-1527-476d-bf4a-d54a022320d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.072513 4846 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.072575 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmgb5\" (UniqueName: \"kubernetes.io/projected/bb094f7c-1527-476d-bf4a-d54a022320d0-kube-api-access-jmgb5\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.072589 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb094f7c-1527-476d-bf4a-d54a022320d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.187133 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-569956d6b4-jtk8r"] Nov 22 09:32:31 crc kubenswrapper[4846]: W1122 09:32:31.190316 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f8c4b78_83b6_4f98_a4e2_ef7f56043775.slice/crio-edca5e5ac25491eaadeb573321a757ebe792c1d6101535b05dfbdb5ede95aa57 WatchSource:0}: Error finding container edca5e5ac25491eaadeb573321a757ebe792c1d6101535b05dfbdb5ede95aa57: Status 404 returned error can't find the container with id edca5e5ac25491eaadeb573321a757ebe792c1d6101535b05dfbdb5ede95aa57 Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.430507 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mtz29" event={"ID":"bb094f7c-1527-476d-bf4a-d54a022320d0","Type":"ContainerDied","Data":"001fc2524290ea1fccdfdb136498deccdb2c92898c7c0ff0479e17908071df43"} Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.430555 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="001fc2524290ea1fccdfdb136498deccdb2c92898c7c0ff0479e17908071df43" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.430617 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mtz29" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.434077 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-569956d6b4-jtk8r" event={"ID":"2f8c4b78-83b6-4f98-a4e2-ef7f56043775","Type":"ContainerStarted","Data":"6439838de235c1ca36a9bed7757c44e697f93f153fac1ab594ebc77fdf23bb57"} Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.434147 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-569956d6b4-jtk8r" event={"ID":"2f8c4b78-83b6-4f98-a4e2-ef7f56043775","Type":"ContainerStarted","Data":"edca5e5ac25491eaadeb573321a757ebe792c1d6101535b05dfbdb5ede95aa57"} Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.444851 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bff9dfbc-88dc-4ecc-95f3-4eac40350d97","Type":"ContainerStarted","Data":"857e8b9de54bbc45a991eec7cc8d37c9ec5c159ad9fe02fe5c8c166dd67d35ac"} Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.445116 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="ceilometer-notification-agent" containerID="cri-o://b38ada579bde7c762ec25c3cfe135c3b599a86375422c7b9dcf9c19adb47d72f" gracePeriod=30 Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.445530 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.445882 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="proxy-httpd" containerID="cri-o://857e8b9de54bbc45a991eec7cc8d37c9ec5c159ad9fe02fe5c8c166dd67d35ac" gracePeriod=30 Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.445948 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="sg-core" containerID="cri-o://501a43069e6df8f9e9414e97cd51203879e497e2c20e22470f604dadaddf2ed6" gracePeriod=30 Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.458608 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5ccd94b5cf-fd5rp" event={"ID":"fe29ba72-dfe7-4536-bf56-c282d31d2acb","Type":"ContainerStarted","Data":"2ce24926780d6fd0377e762c20f0b9961b61b8aec2f7964804f6be8baf0d58b8"} Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.458895 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.739188 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5ccd94b5cf-fd5rp" podStartSLOduration=7.739170538 podStartE2EDuration="7.739170538s" podCreationTimestamp="2025-11-22 09:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:31.506553413 +0000 UTC m=+1126.442243062" watchObservedRunningTime="2025-11-22 09:32:31.739170538 +0000 UTC m=+1126.674860187" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.746418 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5ff9f749db-lj4qc"] Nov 22 09:32:31 crc kubenswrapper[4846]: E1122 09:32:31.746942 4846 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb094f7c-1527-476d-bf4a-d54a022320d0" containerName="barbican-db-sync" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.746966 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb094f7c-1527-476d-bf4a-d54a022320d0" containerName="barbican-db-sync" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.747308 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb094f7c-1527-476d-bf4a-d54a022320d0" containerName="barbican-db-sync" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.754155 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.763669 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.763775 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cq95r" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.763681 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.799221 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5ff9f749db-lj4qc"] Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.804599 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d45bb639-d116-4666-8aea-ba5bc8ca84ea-logs\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.804646 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d45bb639-d116-4666-8aea-ba5bc8ca84ea-config-data\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.804779 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w87lv\" (UniqueName: \"kubernetes.io/projected/d45bb639-d116-4666-8aea-ba5bc8ca84ea-kube-api-access-w87lv\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.804804 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45bb639-d116-4666-8aea-ba5bc8ca84ea-combined-ca-bundle\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.804906 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d45bb639-d116-4666-8aea-ba5bc8ca84ea-config-data-custom\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " 
pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.821108 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5b796967d9-trff5"] Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.822947 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.825420 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.827181 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5b796967d9-trff5"] Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918037 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03409e82-9b6d-43ee-a770-96700e162fac-config-data\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918134 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03409e82-9b6d-43ee-a770-96700e162fac-config-data-custom\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918168 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03409e82-9b6d-43ee-a770-96700e162fac-logs\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918193 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d45bb639-d116-4666-8aea-ba5bc8ca84ea-logs\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918218 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d45bb639-d116-4666-8aea-ba5bc8ca84ea-config-data\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918261 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtv4f\" (UniqueName: \"kubernetes.io/projected/03409e82-9b6d-43ee-a770-96700e162fac-kube-api-access-xtv4f\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918412 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w87lv\" (UniqueName: \"kubernetes.io/projected/d45bb639-d116-4666-8aea-ba5bc8ca84ea-kube-api-access-w87lv\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: 
\"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918434 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45bb639-d116-4666-8aea-ba5bc8ca84ea-combined-ca-bundle\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918563 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03409e82-9b6d-43ee-a770-96700e162fac-combined-ca-bundle\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.918664 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d45bb639-d116-4666-8aea-ba5bc8ca84ea-config-data-custom\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.922615 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d45bb639-d116-4666-8aea-ba5bc8ca84ea-logs\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.942524 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d45bb639-d116-4666-8aea-ba5bc8ca84ea-combined-ca-bundle\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.945799 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d45bb639-d116-4666-8aea-ba5bc8ca84ea-config-data\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.959964 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w87lv\" (UniqueName: \"kubernetes.io/projected/d45bb639-d116-4666-8aea-ba5bc8ca84ea-kube-api-access-w87lv\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.960289 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-htqsr"] Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.964126 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.971780 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d45bb639-d116-4666-8aea-ba5bc8ca84ea-config-data-custom\") pod \"barbican-keystone-listener-5ff9f749db-lj4qc\" (UID: \"d45bb639-d116-4666-8aea-ba5bc8ca84ea\") " pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:31 crc kubenswrapper[4846]: I1122 09:32:31.993133 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-htqsr"] Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020408 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03409e82-9b6d-43ee-a770-96700e162fac-config-data-custom\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020476 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03409e82-9b6d-43ee-a770-96700e162fac-logs\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020523 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtv4f\" (UniqueName: \"kubernetes.io/projected/03409e82-9b6d-43ee-a770-96700e162fac-kube-api-access-xtv4f\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020615 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020669 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ndp9\" (UniqueName: \"kubernetes.io/projected/ddba369c-0b60-4702-a6d9-7818a562f677-kube-api-access-8ndp9\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020716 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020744 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-config\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020780 4846 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03409e82-9b6d-43ee-a770-96700e162fac-combined-ca-bundle\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020856 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020886 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.020967 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03409e82-9b6d-43ee-a770-96700e162fac-config-data\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.024640 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5746494d7d-54slg"] Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.030103 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03409e82-9b6d-43ee-a770-96700e162fac-logs\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.036935 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03409e82-9b6d-43ee-a770-96700e162fac-combined-ca-bundle\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.038221 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.038905 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03409e82-9b6d-43ee-a770-96700e162fac-config-data-custom\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.043488 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.061928 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtv4f\" (UniqueName: \"kubernetes.io/projected/03409e82-9b6d-43ee-a770-96700e162fac-kube-api-access-xtv4f\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.066409 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03409e82-9b6d-43ee-a770-96700e162fac-config-data\") pod \"barbican-worker-5b796967d9-trff5\" (UID: \"03409e82-9b6d-43ee-a770-96700e162fac\") " pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.079264 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.089556 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5746494d7d-54slg"] Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.127426 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.127751 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.127972 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-combined-ca-bundle\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.128184 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgfc4\" (UniqueName: \"kubernetes.io/projected/2ff9a258-1c0d-4003-8261-8664a42d0091-kube-api-access-fgfc4\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.128433 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.128528 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.128664 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ndp9\" (UniqueName: \"kubernetes.io/projected/ddba369c-0b60-4702-a6d9-7818a562f677-kube-api-access-8ndp9\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.128804 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.128852 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-config\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.128865 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.128877 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff9a258-1c0d-4003-8261-8664a42d0091-logs\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.129117 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data-custom\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.129735 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.134176 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-config\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.134462 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.135761 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.149455 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ndp9\" (UniqueName: \"kubernetes.io/projected/ddba369c-0b60-4702-a6d9-7818a562f677-kube-api-access-8ndp9\") pod \"dnsmasq-dns-586bdc5f9-htqsr\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.155221 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5b796967d9-trff5" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.158816 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.234223 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.238062 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff9a258-1c0d-4003-8261-8664a42d0091-logs\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.238114 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data-custom\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.238309 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-combined-ca-bundle\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.238422 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgfc4\" (UniqueName: \"kubernetes.io/projected/2ff9a258-1c0d-4003-8261-8664a42d0091-kube-api-access-fgfc4\") pod 
\"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.238935 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff9a258-1c0d-4003-8261-8664a42d0091-logs\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.245023 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data-custom\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.245690 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.251844 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-combined-ca-bundle\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.266387 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgfc4\" (UniqueName: \"kubernetes.io/projected/2ff9a258-1c0d-4003-8261-8664a42d0091-kube-api-access-fgfc4\") pod \"barbican-api-5746494d7d-54slg\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.487978 4846 generic.go:334] "Generic (PLEG): container finished" podID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerID="857e8b9de54bbc45a991eec7cc8d37c9ec5c159ad9fe02fe5c8c166dd67d35ac" exitCode=0 Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.488450 4846 generic.go:334] "Generic (PLEG): container finished" podID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerID="501a43069e6df8f9e9414e97cd51203879e497e2c20e22470f604dadaddf2ed6" exitCode=2 Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.488068 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bff9dfbc-88dc-4ecc-95f3-4eac40350d97","Type":"ContainerDied","Data":"857e8b9de54bbc45a991eec7cc8d37c9ec5c159ad9fe02fe5c8c166dd67d35ac"} Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.488606 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bff9dfbc-88dc-4ecc-95f3-4eac40350d97","Type":"ContainerDied","Data":"501a43069e6df8f9e9414e97cd51203879e497e2c20e22470f604dadaddf2ed6"} Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.498098 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-569956d6b4-jtk8r" event={"ID":"2f8c4b78-83b6-4f98-a4e2-ef7f56043775","Type":"ContainerStarted","Data":"76555cb682a19f1a1e6660d7ca5ba28e45f936cf8201612504a6abce7ee8af7a"} Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.498258 4846 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.498307 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.502661 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.539420 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-569956d6b4-jtk8r" podStartSLOduration=7.539384532 podStartE2EDuration="7.539384532s" podCreationTimestamp="2025-11-22 09:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:32.525596378 +0000 UTC m=+1127.461286027" watchObservedRunningTime="2025-11-22 09:32:32.539384532 +0000 UTC m=+1127.475074181" Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.814680 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-htqsr"] Nov 22 09:32:32 crc kubenswrapper[4846]: I1122 09:32:32.970700 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5ff9f749db-lj4qc"] Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.009521 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5b796967d9-trff5"] Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.072658 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5746494d7d-54slg"] Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.510918 4846 generic.go:334] "Generic (PLEG): container finished" podID="083da0b8-38d6-4eab-b211-8389df97a0a8" containerID="4f85422a8125cdaf852b99a918947ac6379240d75c79121f17a573a8bb7927ce" exitCode=0 Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.511495 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xgp99" event={"ID":"083da0b8-38d6-4eab-b211-8389df97a0a8","Type":"ContainerDied","Data":"4f85422a8125cdaf852b99a918947ac6379240d75c79121f17a573a8bb7927ce"} Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.526068 4846 generic.go:334] "Generic (PLEG): container finished" podID="ddba369c-0b60-4702-a6d9-7818a562f677" containerID="d248bd88abc8308bb2cb2eaf91f43daf27656501bc157b2070b5efeed3598732" exitCode=0 Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.526202 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" event={"ID":"ddba369c-0b60-4702-a6d9-7818a562f677","Type":"ContainerDied","Data":"d248bd88abc8308bb2cb2eaf91f43daf27656501bc157b2070b5efeed3598732"} Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.526270 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" event={"ID":"ddba369c-0b60-4702-a6d9-7818a562f677","Type":"ContainerStarted","Data":"9cd6b2f48378e96e2e17aef63f6f5fa68a78936a89db80387152fc1c8d43652e"} Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.528342 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b796967d9-trff5" event={"ID":"03409e82-9b6d-43ee-a770-96700e162fac","Type":"ContainerStarted","Data":"9c45ed31349fcbc34f71f327704448f3b575096dc3e4de498a1aacf21a75d9ce"} Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.534442 4846 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" event={"ID":"d45bb639-d116-4666-8aea-ba5bc8ca84ea","Type":"ContainerStarted","Data":"774afb210fbe7696ef25a82ccb55c7b1437bcab50a5a9101182dc832b97cba28"} Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.539659 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5746494d7d-54slg" event={"ID":"2ff9a258-1c0d-4003-8261-8664a42d0091","Type":"ContainerStarted","Data":"f0849769cded9b27f9870a134ae92c617cd6d215232728c70c6c26c2301208c9"} Nov 22 09:32:33 crc kubenswrapper[4846]: I1122 09:32:33.539960 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5746494d7d-54slg" event={"ID":"2ff9a258-1c0d-4003-8261-8664a42d0091","Type":"ContainerStarted","Data":"f1f6277a3d70ea1b09ad2fcf689b4c493a294e0b8306771b062fb4687713589d"} Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.553410 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" event={"ID":"ddba369c-0b60-4702-a6d9-7818a562f677","Type":"ContainerStarted","Data":"ce9f63f4ec244bbc358baccf5d1d4eefcf8165abb43be4ff9ce4591430dda60b"} Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.553846 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.557433 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5746494d7d-54slg" event={"ID":"2ff9a258-1c0d-4003-8261-8664a42d0091","Type":"ContainerStarted","Data":"25f356cf8d41310ff44d2ab3490dbe65c59b831533954846c04c2e5b1a6f2bcc"} Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.580434 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" podStartSLOduration=3.580415487 podStartE2EDuration="3.580415487s" podCreationTimestamp="2025-11-22 09:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:34.572442944 +0000 UTC m=+1129.508132603" watchObservedRunningTime="2025-11-22 09:32:34.580415487 +0000 UTC m=+1129.516105146" Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.595986 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5746494d7d-54slg" podStartSLOduration=3.595966463 podStartE2EDuration="3.595966463s" podCreationTimestamp="2025-11-22 09:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:34.594745457 +0000 UTC m=+1129.530435126" watchObservedRunningTime="2025-11-22 09:32:34.595966463 +0000 UTC m=+1129.531656122" Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.925795 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55fdfc87fd-75r6l"] Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.928271 4846 util.go:30] "No sandbox for pod can be found. 
Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.932199 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.932449 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 22 09:32:34 crc kubenswrapper[4846]: I1122 09:32:34.945600 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55fdfc87fd-75r6l"] Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.028662 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54092b40-6b71-4920-b703-b6b44e0e2331-logs\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.028727 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-internal-tls-certs\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.028995 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-combined-ca-bundle\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.029102 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-public-tls-certs\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.029165 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpl59\" (UniqueName: \"kubernetes.io/projected/54092b40-6b71-4920-b703-b6b44e0e2331-kube-api-access-qpl59\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.029250 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-config-data-custom\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.029715 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-config-data\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.131244 4846 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54092b40-6b71-4920-b703-b6b44e0e2331-logs\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.131292 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-internal-tls-certs\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.131352 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-combined-ca-bundle\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.131384 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-public-tls-certs\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.131410 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpl59\" (UniqueName: \"kubernetes.io/projected/54092b40-6b71-4920-b703-b6b44e0e2331-kube-api-access-qpl59\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.131439 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-config-data-custom\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.131521 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-config-data\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.135720 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54092b40-6b71-4920-b703-b6b44e0e2331-logs\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.141730 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-internal-tls-certs\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.143738 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-config-data\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.143971 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-config-data-custom\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.145649 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-public-tls-certs\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.145660 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54092b40-6b71-4920-b703-b6b44e0e2331-combined-ca-bundle\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.160463 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpl59\" (UniqueName: \"kubernetes.io/projected/54092b40-6b71-4920-b703-b6b44e0e2331-kube-api-access-qpl59\") pod \"barbican-api-55fdfc87fd-75r6l\" (UID: \"54092b40-6b71-4920-b703-b6b44e0e2331\") " pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.283932 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.555360 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xgp99" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.570899 4846 generic.go:334] "Generic (PLEG): container finished" podID="584aeb0f-b1a9-4a6e-b129-b21593065b18" containerID="3750c0f79fd15b824c4a981dda2963917b77fa7732dee70a171168e51b897084" exitCode=0 Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.570975 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9m5n9" event={"ID":"584aeb0f-b1a9-4a6e-b129-b21593065b18","Type":"ContainerDied","Data":"3750c0f79fd15b824c4a981dda2963917b77fa7732dee70a171168e51b897084"} Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.574385 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xgp99" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.574527 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xgp99" event={"ID":"083da0b8-38d6-4eab-b211-8389df97a0a8","Type":"ContainerDied","Data":"a97d32c8e6aa0d4cd658af678eaddc8fda63ec72ff34d23471f5974587c31c09"} Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.574550 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97d32c8e6aa0d4cd658af678eaddc8fda63ec72ff34d23471f5974587c31c09" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.574950 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.574980 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.646156 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/083da0b8-38d6-4eab-b211-8389df97a0a8-etc-machine-id\") pod \"083da0b8-38d6-4eab-b211-8389df97a0a8\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.646362 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-config-data\") pod \"083da0b8-38d6-4eab-b211-8389df97a0a8\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.646393 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-combined-ca-bundle\") pod \"083da0b8-38d6-4eab-b211-8389df97a0a8\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.646446 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hczs\" (UniqueName: \"kubernetes.io/projected/083da0b8-38d6-4eab-b211-8389df97a0a8-kube-api-access-9hczs\") pod \"083da0b8-38d6-4eab-b211-8389df97a0a8\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.646477 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-db-sync-config-data\") pod \"083da0b8-38d6-4eab-b211-8389df97a0a8\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.646526 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-scripts\") pod \"083da0b8-38d6-4eab-b211-8389df97a0a8\" (UID: \"083da0b8-38d6-4eab-b211-8389df97a0a8\") " Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.646609 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/083da0b8-38d6-4eab-b211-8389df97a0a8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "083da0b8-38d6-4eab-b211-8389df97a0a8" (UID: "083da0b8-38d6-4eab-b211-8389df97a0a8"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.647884 4846 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/083da0b8-38d6-4eab-b211-8389df97a0a8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.702180 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "083da0b8-38d6-4eab-b211-8389df97a0a8" (UID: "083da0b8-38d6-4eab-b211-8389df97a0a8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.702314 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-scripts" (OuterVolumeSpecName: "scripts") pod "083da0b8-38d6-4eab-b211-8389df97a0a8" (UID: "083da0b8-38d6-4eab-b211-8389df97a0a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.702376 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/083da0b8-38d6-4eab-b211-8389df97a0a8-kube-api-access-9hczs" (OuterVolumeSpecName: "kube-api-access-9hczs") pod "083da0b8-38d6-4eab-b211-8389df97a0a8" (UID: "083da0b8-38d6-4eab-b211-8389df97a0a8"). InnerVolumeSpecName "kube-api-access-9hczs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.717806 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "083da0b8-38d6-4eab-b211-8389df97a0a8" (UID: "083da0b8-38d6-4eab-b211-8389df97a0a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.749863 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.749901 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hczs\" (UniqueName: \"kubernetes.io/projected/083da0b8-38d6-4eab-b211-8389df97a0a8-kube-api-access-9hczs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.749915 4846 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.749929 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.800408 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-config-data" (OuterVolumeSpecName: "config-data") pod "083da0b8-38d6-4eab-b211-8389df97a0a8" (UID: "083da0b8-38d6-4eab-b211-8389df97a0a8"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.833540 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55fdfc87fd-75r6l"] Nov 22 09:32:35 crc kubenswrapper[4846]: I1122 09:32:35.853673 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/083da0b8-38d6-4eab-b211-8389df97a0a8-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.598776 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55fdfc87fd-75r6l" event={"ID":"54092b40-6b71-4920-b703-b6b44e0e2331","Type":"ContainerStarted","Data":"042f24d6d3d0a9b15673360e32903020deed96993ee93f9952eef285477c3bb7"} Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.601456 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55fdfc87fd-75r6l" event={"ID":"54092b40-6b71-4920-b703-b6b44e0e2331","Type":"ContainerStarted","Data":"4a7368917363906b01ad02d8642e8c71003bf9b6aa74ce061299ec52dac666bc"} Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.605913 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" event={"ID":"d45bb639-d116-4666-8aea-ba5bc8ca84ea","Type":"ContainerStarted","Data":"b17ff0930e9bd5c4433989e5e0cacb2949cf1bf57fcb24f4d93907e0871ab995"} Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.606111 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" event={"ID":"d45bb639-d116-4666-8aea-ba5bc8ca84ea","Type":"ContainerStarted","Data":"eceaa26b175ea966e55998cc4e64ec4ffc0abceeffde21879de958a12099c56d"} Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.609084 4846 generic.go:334] "Generic (PLEG): container finished" podID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerID="b38ada579bde7c762ec25c3cfe135c3b599a86375422c7b9dcf9c19adb47d72f" exitCode=0 Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.609150 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bff9dfbc-88dc-4ecc-95f3-4eac40350d97","Type":"ContainerDied","Data":"b38ada579bde7c762ec25c3cfe135c3b599a86375422c7b9dcf9c19adb47d72f"} Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.636066 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b796967d9-trff5" event={"ID":"03409e82-9b6d-43ee-a770-96700e162fac","Type":"ContainerStarted","Data":"d8b5b9acbcfd14a529434b225471391bcb01e30e456ff4099ca36691d8a62f53"} Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.705508 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5ff9f749db-lj4qc" podStartSLOduration=3.53208634 podStartE2EDuration="5.705480384s" podCreationTimestamp="2025-11-22 09:32:31 +0000 UTC" firstStartedPulling="2025-11-22 09:32:33.008842275 +0000 UTC m=+1127.944531924" lastFinishedPulling="2025-11-22 09:32:35.182236319 +0000 UTC m=+1130.117925968" observedRunningTime="2025-11-22 09:32:36.642422167 +0000 UTC m=+1131.578111816" watchObservedRunningTime="2025-11-22 09:32:36.705480384 +0000 UTC m=+1131.641170033" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.717201 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5b796967d9-trff5" podStartSLOduration=3.582798956 
podStartE2EDuration="5.717179777s" podCreationTimestamp="2025-11-22 09:32:31 +0000 UTC" firstStartedPulling="2025-11-22 09:32:33.047824817 +0000 UTC m=+1127.983514466" lastFinishedPulling="2025-11-22 09:32:35.182205638 +0000 UTC m=+1130.117895287" observedRunningTime="2025-11-22 09:32:36.673446126 +0000 UTC m=+1131.609135785" watchObservedRunningTime="2025-11-22 09:32:36.717179777 +0000 UTC m=+1131.652869426" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.731109 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.883147 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-config-data\") pod \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.883238 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-combined-ca-bundle\") pod \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.883288 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-run-httpd\") pod \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.883345 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlljj\" (UniqueName: \"kubernetes.io/projected/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-kube-api-access-vlljj\") pod \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.883459 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-scripts\") pod \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.883485 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-sg-core-conf-yaml\") pod \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.883513 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-log-httpd\") pod \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\" (UID: \"bff9dfbc-88dc-4ecc-95f3-4eac40350d97\") " Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.884625 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bff9dfbc-88dc-4ecc-95f3-4eac40350d97" (UID: "bff9dfbc-88dc-4ecc-95f3-4eac40350d97"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.889387 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bff9dfbc-88dc-4ecc-95f3-4eac40350d97" (UID: "bff9dfbc-88dc-4ecc-95f3-4eac40350d97"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.913324 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-scripts" (OuterVolumeSpecName: "scripts") pod "bff9dfbc-88dc-4ecc-95f3-4eac40350d97" (UID: "bff9dfbc-88dc-4ecc-95f3-4eac40350d97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.914968 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:32:36 crc kubenswrapper[4846]: E1122 09:32:36.915455 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="proxy-httpd" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.915469 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="proxy-httpd" Nov 22 09:32:36 crc kubenswrapper[4846]: E1122 09:32:36.915492 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="sg-core" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.915498 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="sg-core" Nov 22 09:32:36 crc kubenswrapper[4846]: E1122 09:32:36.915508 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="ceilometer-notification-agent" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.915514 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="ceilometer-notification-agent" Nov 22 09:32:36 crc kubenswrapper[4846]: E1122 09:32:36.915538 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="083da0b8-38d6-4eab-b211-8389df97a0a8" containerName="cinder-db-sync" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.915543 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="083da0b8-38d6-4eab-b211-8389df97a0a8" containerName="cinder-db-sync" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.915565 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-kube-api-access-vlljj" (OuterVolumeSpecName: "kube-api-access-vlljj") pod "bff9dfbc-88dc-4ecc-95f3-4eac40350d97" (UID: "bff9dfbc-88dc-4ecc-95f3-4eac40350d97"). InnerVolumeSpecName "kube-api-access-vlljj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.915715 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="proxy-httpd" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.915726 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="ceilometer-notification-agent" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.915742 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" containerName="sg-core" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.915752 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="083da0b8-38d6-4eab-b211-8389df97a0a8" containerName="cinder-db-sync" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.916797 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.920170 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.920429 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.920984 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zqsc4" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.935742 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.957424 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.991039 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.991111 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lfjs\" (UniqueName: \"kubernetes.io/projected/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-kube-api-access-6lfjs\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.991165 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.991207 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-scripts\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.991287 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.991336 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.999548 4846 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.999620 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlljj\" (UniqueName: \"kubernetes.io/projected/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-kube-api-access-vlljj\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.999631 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:36 crc kubenswrapper[4846]: I1122 09:32:36.999642 4846 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.005327 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bff9dfbc-88dc-4ecc-95f3-4eac40350d97" (UID: "bff9dfbc-88dc-4ecc-95f3-4eac40350d97"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.062341 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-htqsr"] Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.063193 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" podUID="ddba369c-0b60-4702-a6d9-7818a562f677" containerName="dnsmasq-dns" containerID="cri-o://ce9f63f4ec244bbc358baccf5d1d4eefcf8165abb43be4ff9ce4591430dda60b" gracePeriod=10 Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.102236 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-scripts\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.102333 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.102377 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.102424 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.102446 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lfjs\" (UniqueName: \"kubernetes.io/projected/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-kube-api-access-6lfjs\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.102483 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.102537 4846 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.108021 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.108021 4846 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-795f4db4bc-k5lzf"] Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.109733 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.111770 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.127067 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-scripts\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.132359 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.138085 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-k5lzf"] Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.141712 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.142566 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lfjs\" (UniqueName: \"kubernetes.io/projected/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-kube-api-access-6lfjs\") pod \"cinder-scheduler-0\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.161934 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-config-data" (OuterVolumeSpecName: "config-data") pod "bff9dfbc-88dc-4ecc-95f3-4eac40350d97" (UID: "bff9dfbc-88dc-4ecc-95f3-4eac40350d97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.168244 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bff9dfbc-88dc-4ecc-95f3-4eac40350d97" (UID: "bff9dfbc-88dc-4ecc-95f3-4eac40350d97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.204680 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.204842 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.204898 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.204964 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-config\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.204994 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.205093 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dntb\" (UniqueName: \"kubernetes.io/projected/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-kube-api-access-8dntb\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.208350 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.208380 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff9dfbc-88dc-4ecc-95f3-4eac40350d97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.257406 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.259281 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.265238 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.270202 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.277905 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.328277 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.328324 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.328360 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-config\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.328382 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.328411 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dntb\" (UniqueName: \"kubernetes.io/projected/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-kube-api-access-8dntb\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.328485 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.329403 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.329716 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: 
\"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.330097 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-config\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.330330 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.330629 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.360600 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dntb\" (UniqueName: \"kubernetes.io/projected/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-kube-api-access-8dntb\") pod \"dnsmasq-dns-795f4db4bc-k5lzf\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.431959 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.432431 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.432479 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data-custom\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.432561 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-scripts\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.432595 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-logs\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.432659 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tmv4\" (UniqueName: \"kubernetes.io/projected/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-kube-api-access-4tmv4\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.432682 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.494220 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.530246 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.535144 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-scripts\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.535190 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-logs\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.535245 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tmv4\" (UniqueName: \"kubernetes.io/projected/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-kube-api-access-4tmv4\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.535269 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.535306 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.535335 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.535366 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data-custom\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc 
kubenswrapper[4846]: I1122 09:32:37.536647 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.537178 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-logs\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.543627 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data-custom\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.549137 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.551287 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.559677 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-scripts\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.568747 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tmv4\" (UniqueName: \"kubernetes.io/projected/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-kube-api-access-4tmv4\") pod \"cinder-api-0\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.637717 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldgsg\" (UniqueName: \"kubernetes.io/projected/584aeb0f-b1a9-4a6e-b129-b21593065b18-kube-api-access-ldgsg\") pod \"584aeb0f-b1a9-4a6e-b129-b21593065b18\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.637855 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-config\") pod \"584aeb0f-b1a9-4a6e-b129-b21593065b18\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.637983 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-combined-ca-bundle\") pod \"584aeb0f-b1a9-4a6e-b129-b21593065b18\" (UID: \"584aeb0f-b1a9-4a6e-b129-b21593065b18\") " Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.662617 4846 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584aeb0f-b1a9-4a6e-b129-b21593065b18-kube-api-access-ldgsg" (OuterVolumeSpecName: "kube-api-access-ldgsg") pod "584aeb0f-b1a9-4a6e-b129-b21593065b18" (UID: "584aeb0f-b1a9-4a6e-b129-b21593065b18"). InnerVolumeSpecName "kube-api-access-ldgsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.697334 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b796967d9-trff5" event={"ID":"03409e82-9b6d-43ee-a770-96700e162fac","Type":"ContainerStarted","Data":"caca342e031ce6db64b2d4314e0ccdaae7f28adb8b3d71af8794e31fd3d7fe1e"} Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.716497 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55fdfc87fd-75r6l" event={"ID":"54092b40-6b71-4920-b703-b6b44e0e2331","Type":"ContainerStarted","Data":"9c58a6e1695fc0a17e1493340e40faee493ddfb597088bb5dda1b0e69b851b56"} Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.722526 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.722921 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.746848 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldgsg\" (UniqueName: \"kubernetes.io/projected/584aeb0f-b1a9-4a6e-b129-b21593065b18-kube-api-access-ldgsg\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.766420 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9m5n9" event={"ID":"584aeb0f-b1a9-4a6e-b129-b21593065b18","Type":"ContainerDied","Data":"01cc4512fd58cbfe50ee535a46de94c62ee10d68ebc52f87bd58f3efac96ad36"} Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.766460 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01cc4512fd58cbfe50ee535a46de94c62ee10d68ebc52f87bd58f3efac96ad36" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.766548 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.785273 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55fdfc87fd-75r6l" podStartSLOduration=3.785250258 podStartE2EDuration="3.785250258s" podCreationTimestamp="2025-11-22 09:32:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:37.762540953 +0000 UTC m=+1132.698230602" watchObservedRunningTime="2025-11-22 09:32:37.785250258 +0000 UTC m=+1132.720939907" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.815001 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bff9dfbc-88dc-4ecc-95f3-4eac40350d97","Type":"ContainerDied","Data":"2597f58413f7b206c485f5b1a293f17ab41e1a50b2325ee7aeb2aa3474fbeecb"} Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.819322 4846 scope.go:117] "RemoveContainer" containerID="857e8b9de54bbc45a991eec7cc8d37c9ec5c159ad9fe02fe5c8c166dd67d35ac" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.818857 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.820320 4846 generic.go:334] "Generic (PLEG): container finished" podID="ddba369c-0b60-4702-a6d9-7818a562f677" containerID="ce9f63f4ec244bbc358baccf5d1d4eefcf8165abb43be4ff9ce4591430dda60b" exitCode=0 Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.820727 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" event={"ID":"ddba369c-0b60-4702-a6d9-7818a562f677","Type":"ContainerDied","Data":"ce9f63f4ec244bbc358baccf5d1d4eefcf8165abb43be4ff9ce4591430dda60b"} Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.815785 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.821739 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-config" (OuterVolumeSpecName: "config") pod "584aeb0f-b1a9-4a6e-b129-b21593065b18" (UID: "584aeb0f-b1a9-4a6e-b129-b21593065b18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.853809 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.882590 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "584aeb0f-b1a9-4a6e-b129-b21593065b18" (UID: "584aeb0f-b1a9-4a6e-b129-b21593065b18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.925135 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-k5lzf"] Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.963734 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584aeb0f-b1a9-4a6e-b129-b21593065b18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.984142 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fpp2t"] Nov 22 09:32:37 crc kubenswrapper[4846]: E1122 09:32:37.985184 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584aeb0f-b1a9-4a6e-b129-b21593065b18" containerName="neutron-db-sync" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.985202 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="584aeb0f-b1a9-4a6e-b129-b21593065b18" containerName="neutron-db-sync" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.985752 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="584aeb0f-b1a9-4a6e-b129-b21593065b18" containerName="neutron-db-sync" Nov 22 09:32:37 crc kubenswrapper[4846]: I1122 09:32:37.992450 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.037869 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57dccfc6dd-fnk6j"] Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.039768 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.043607 4846 scope.go:117] "RemoveContainer" containerID="501a43069e6df8f9e9414e97cd51203879e497e2c20e22470f604dadaddf2ed6" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.047987 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.071273 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqq4d\" (UniqueName: \"kubernetes.io/projected/ee773f3f-4677-4ceb-957f-a7c1743688a3-kube-api-access-dqq4d\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.071424 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.071651 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-config\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.071789 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.071814 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.071923 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.106619 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fpp2t"] Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.106659 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57dccfc6dd-fnk6j"] Nov 22 09:32:38 crc 
kubenswrapper[4846]: I1122 09:32:38.111254 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.120270 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.124887 4846 scope.go:117] "RemoveContainer" containerID="b38ada579bde7c762ec25c3cfe135c3b599a86375422c7b9dcf9c19adb47d72f" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.146097 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.153522 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.155304 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.156900 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.173581 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.174456 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-config\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.174563 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-combined-ca-bundle\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.174661 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.174730 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.174827 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-ovndb-tls-certs\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.174903 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-httpd-config\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " 
pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.174996 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.175106 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqq4d\" (UniqueName: \"kubernetes.io/projected/ee773f3f-4677-4ceb-957f-a7c1743688a3-kube-api-access-dqq4d\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.175211 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-config\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.175280 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.175367 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpn8n\" (UniqueName: \"kubernetes.io/projected/4c879ac4-a859-4369-82eb-fc980f7a2881-kube-api-access-fpn8n\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.176436 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-config\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.177388 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.177743 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.177923 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc 
kubenswrapper[4846]: I1122 09:32:38.178361 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.209841 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqq4d\" (UniqueName: \"kubernetes.io/projected/ee773f3f-4677-4ceb-957f-a7c1743688a3-kube-api-access-dqq4d\") pod \"dnsmasq-dns-5c9776ccc5-fpp2t\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.215826 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.279885 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ndp9\" (UniqueName: \"kubernetes.io/projected/ddba369c-0b60-4702-a6d9-7818a562f677-kube-api-access-8ndp9\") pod \"ddba369c-0b60-4702-a6d9-7818a562f677\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.279947 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-svc\") pod \"ddba369c-0b60-4702-a6d9-7818a562f677\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.280091 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-config\") pod \"ddba369c-0b60-4702-a6d9-7818a562f677\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.280189 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-swift-storage-0\") pod \"ddba369c-0b60-4702-a6d9-7818a562f677\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.280356 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-nb\") pod \"ddba369c-0b60-4702-a6d9-7818a562f677\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.280393 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-sb\") pod \"ddba369c-0b60-4702-a6d9-7818a562f677\" (UID: \"ddba369c-0b60-4702-a6d9-7818a562f677\") " Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.280889 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-config\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281292 4846 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fpn8n\" (UniqueName: \"kubernetes.io/projected/4c879ac4-a859-4369-82eb-fc980f7a2881-kube-api-access-fpn8n\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281351 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-config-data\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281379 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-run-httpd\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281493 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-combined-ca-bundle\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281558 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-ovndb-tls-certs\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281586 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-httpd-config\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281615 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2ftr\" (UniqueName: \"kubernetes.io/projected/350dff5e-52cf-4530-9527-46f8c8dc3487-kube-api-access-x2ftr\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281644 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-scripts\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281698 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281728 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-combined-ca-bundle\") 
pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.281751 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-log-httpd\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.300316 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddba369c-0b60-4702-a6d9-7818a562f677-kube-api-access-8ndp9" (OuterVolumeSpecName: "kube-api-access-8ndp9") pod "ddba369c-0b60-4702-a6d9-7818a562f677" (UID: "ddba369c-0b60-4702-a6d9-7818a562f677"). InnerVolumeSpecName "kube-api-access-8ndp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.309886 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-ovndb-tls-certs\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.310265 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.325079 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-combined-ca-bundle\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.381814 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.382926 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-config\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.384542 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-config-data\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.384576 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-run-httpd\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.384668 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2ftr\" (UniqueName: \"kubernetes.io/projected/350dff5e-52cf-4530-9527-46f8c8dc3487-kube-api-access-x2ftr\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.384692 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-scripts\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.384733 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.384754 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.384774 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-log-httpd\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.384865 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ndp9\" (UniqueName: \"kubernetes.io/projected/ddba369c-0b60-4702-a6d9-7818a562f677-kube-api-access-8ndp9\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.404274 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-log-httpd\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc 
kubenswrapper[4846]: I1122 09:32:38.409466 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-run-httpd\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.409543 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.426879 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-scripts\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.427852 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-config-data\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.437654 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpn8n\" (UniqueName: \"kubernetes.io/projected/4c879ac4-a859-4369-82eb-fc980f7a2881-kube-api-access-fpn8n\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.438097 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-httpd-config\") pod \"neutron-57dccfc6dd-fnk6j\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") " pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.438295 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.447924 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-k5lzf"] Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.454146 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.459185 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2ftr\" (UniqueName: \"kubernetes.io/projected/350dff5e-52cf-4530-9527-46f8c8dc3487-kube-api-access-x2ftr\") pod \"ceilometer-0\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") " pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.603703 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.633600 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.687744 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ddba369c-0b60-4702-a6d9-7818a562f677" (UID: "ddba369c-0b60-4702-a6d9-7818a562f677"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.700489 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.711762 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ddba369c-0b60-4702-a6d9-7818a562f677" (UID: "ddba369c-0b60-4702-a6d9-7818a562f677"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.716189 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:32:38 crc kubenswrapper[4846]: W1122 09:32:38.727215 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f51fe06_60dd_4c64_91f9_3ecbf14ca6c1.slice/crio-134257b4a511c69f629f167844d3e63e1f788b11ad76ffe9c99879dc2c7b3759 WatchSource:0}: Error finding container 134257b4a511c69f629f167844d3e63e1f788b11ad76ffe9c99879dc2c7b3759: Status 404 returned error can't find the container with id 134257b4a511c69f629f167844d3e63e1f788b11ad76ffe9c99879dc2c7b3759 Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.731430 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-config" (OuterVolumeSpecName: "config") pod "ddba369c-0b60-4702-a6d9-7818a562f677" (UID: "ddba369c-0b60-4702-a6d9-7818a562f677"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.732324 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ddba369c-0b60-4702-a6d9-7818a562f677" (UID: "ddba369c-0b60-4702-a6d9-7818a562f677"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.778732 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ddba369c-0b60-4702-a6d9-7818a562f677" (UID: "ddba369c-0b60-4702-a6d9-7818a562f677"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.805799 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.805829 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.805839 4846 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.805850 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddba369c-0b60-4702-a6d9-7818a562f677-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.879546 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96c0f5ab-1d78-4937-bbc9-cafa758ebf56","Type":"ContainerStarted","Data":"585766e7a6d24707753e17b143fc52ea7bd18bdccb5d49fe68576da21db0eebd"} Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.895268 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" event={"ID":"2dd727ed-a2cd-49c2-b38f-2f38ba88218c","Type":"ContainerStarted","Data":"bb6f51d528cef61d71fe0c91da7b17f6860bcde58229384ac750e24305313ade"} Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.920396 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" event={"ID":"ddba369c-0b60-4702-a6d9-7818a562f677","Type":"ContainerDied","Data":"9cd6b2f48378e96e2e17aef63f6f5fa68a78936a89db80387152fc1c8d43652e"} Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.920465 4846 scope.go:117] "RemoveContainer" containerID="ce9f63f4ec244bbc358baccf5d1d4eefcf8165abb43be4ff9ce4591430dda60b" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.920571 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-htqsr" Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.931941 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1","Type":"ContainerStarted","Data":"134257b4a511c69f629f167844d3e63e1f788b11ad76ffe9c99879dc2c7b3759"} Nov 22 09:32:38 crc kubenswrapper[4846]: I1122 09:32:38.991315 4846 scope.go:117] "RemoveContainer" containerID="d248bd88abc8308bb2cb2eaf91f43daf27656501bc157b2070b5efeed3598732" Nov 22 09:32:39 crc kubenswrapper[4846]: I1122 09:32:39.060703 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-htqsr"] Nov 22 09:32:39 crc kubenswrapper[4846]: I1122 09:32:39.088618 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-htqsr"] Nov 22 09:32:39 crc kubenswrapper[4846]: I1122 09:32:39.138754 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fpp2t"] Nov 22 09:32:39 crc kubenswrapper[4846]: I1122 09:32:39.527968 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:32:39 crc kubenswrapper[4846]: I1122 09:32:39.609631 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57dccfc6dd-fnk6j"] Nov 22 09:32:39 crc kubenswrapper[4846]: I1122 09:32:39.933730 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8564f79874-c88vw" Nov 22 09:32:39 crc kubenswrapper[4846]: I1122 09:32:39.980870 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerStarted","Data":"44ff705219db0167bcbd9e00d2956036afc459c3acb50e85aa5c19115a0411b7"} Nov 22 09:32:39 crc kubenswrapper[4846]: I1122 09:32:39.997740 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" event={"ID":"ee773f3f-4677-4ceb-957f-a7c1743688a3","Type":"ContainerStarted","Data":"9cb7591cb1576a301fae105d58e02ce4eeb6a551e1cb580f61cb55e2d62a74e2"} Nov 22 09:32:39 crc kubenswrapper[4846]: I1122 09:32:39.997790 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" event={"ID":"ee773f3f-4677-4ceb-957f-a7c1743688a3","Type":"ContainerStarted","Data":"b55984033fd72ca94b8d4e6dca48704e261e1053dd768cb1aab3c6a47e5cb424"} Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.007331 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57dccfc6dd-fnk6j" event={"ID":"4c879ac4-a859-4369-82eb-fc980f7a2881","Type":"ContainerStarted","Data":"306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586"} Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.007411 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57dccfc6dd-fnk6j" event={"ID":"4c879ac4-a859-4369-82eb-fc980f7a2881","Type":"ContainerStarted","Data":"55bb3373036b55fd2070f540addc15bc7e58eddc07597d1447d91b376b4caac9"} Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.013280 4846 generic.go:334] "Generic (PLEG): container finished" podID="2dd727ed-a2cd-49c2-b38f-2f38ba88218c" containerID="04381de958741b1cd522bab3c2526cc73d1843a837bb924c29f89e40923ea430" exitCode=0 Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.013712 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" 
event={"ID":"2dd727ed-a2cd-49c2-b38f-2f38ba88218c","Type":"ContainerDied","Data":"04381de958741b1cd522bab3c2526cc73d1843a837bb924c29f89e40923ea430"} Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.072865 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bff9dfbc-88dc-4ecc-95f3-4eac40350d97" path="/var/lib/kubelet/pods/bff9dfbc-88dc-4ecc-95f3-4eac40350d97/volumes" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.078406 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddba369c-0b60-4702-a6d9-7818a562f677" path="/var/lib/kubelet/pods/ddba369c-0b60-4702-a6d9-7818a562f677/volumes" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.137312 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.644683 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.793565 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-config\") pod \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.793622 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-nb\") pod \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.793674 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-swift-storage-0\") pod \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.793769 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dntb\" (UniqueName: \"kubernetes.io/projected/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-kube-api-access-8dntb\") pod \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.793845 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-svc\") pod \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.793944 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-sb\") pod \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\" (UID: \"2dd727ed-a2cd-49c2-b38f-2f38ba88218c\") " Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.810554 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-kube-api-access-8dntb" (OuterVolumeSpecName: "kube-api-access-8dntb") pod "2dd727ed-a2cd-49c2-b38f-2f38ba88218c" (UID: "2dd727ed-a2cd-49c2-b38f-2f38ba88218c"). InnerVolumeSpecName "kube-api-access-8dntb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.829090 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2dd727ed-a2cd-49c2-b38f-2f38ba88218c" (UID: "2dd727ed-a2cd-49c2-b38f-2f38ba88218c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.839169 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-config" (OuterVolumeSpecName: "config") pod "2dd727ed-a2cd-49c2-b38f-2f38ba88218c" (UID: "2dd727ed-a2cd-49c2-b38f-2f38ba88218c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.840921 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2dd727ed-a2cd-49c2-b38f-2f38ba88218c" (UID: "2dd727ed-a2cd-49c2-b38f-2f38ba88218c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.849725 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2dd727ed-a2cd-49c2-b38f-2f38ba88218c" (UID: "2dd727ed-a2cd-49c2-b38f-2f38ba88218c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.849836 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2dd727ed-a2cd-49c2-b38f-2f38ba88218c" (UID: "2dd727ed-a2cd-49c2-b38f-2f38ba88218c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.898152 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.898211 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.898229 4846 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.898246 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dntb\" (UniqueName: \"kubernetes.io/projected/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-kube-api-access-8dntb\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.898282 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:40 crc kubenswrapper[4846]: I1122 09:32:40.898294 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dd727ed-a2cd-49c2-b38f-2f38ba88218c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.032405 4846 generic.go:334] "Generic (PLEG): container finished" podID="ee773f3f-4677-4ceb-957f-a7c1743688a3" containerID="9cb7591cb1576a301fae105d58e02ce4eeb6a551e1cb580f61cb55e2d62a74e2" exitCode=0 Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.032472 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" event={"ID":"ee773f3f-4677-4ceb-957f-a7c1743688a3","Type":"ContainerDied","Data":"9cb7591cb1576a301fae105d58e02ce4eeb6a551e1cb580f61cb55e2d62a74e2"} Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.032498 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" event={"ID":"ee773f3f-4677-4ceb-957f-a7c1743688a3","Type":"ContainerStarted","Data":"56769491f945c967a64c85ac8dc8417e4e0cf02ef727dad109906b64c5a332dd"} Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.033739 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.046272 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57dccfc6dd-fnk6j" event={"ID":"4c879ac4-a859-4369-82eb-fc980f7a2881","Type":"ContainerStarted","Data":"dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce"} Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.047705 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.050981 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" event={"ID":"2dd727ed-a2cd-49c2-b38f-2f38ba88218c","Type":"ContainerDied","Data":"bb6f51d528cef61d71fe0c91da7b17f6860bcde58229384ac750e24305313ade"} Nov 22 09:32:41 crc kubenswrapper[4846]: 
I1122 09:32:41.053504 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.053551 4846 scope.go:117] "RemoveContainer" containerID="04381de958741b1cd522bab3c2526cc73d1843a837bb924c29f89e40923ea430" Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.051076 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-k5lzf" Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.061752 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" podStartSLOduration=4.061733788 podStartE2EDuration="4.061733788s" podCreationTimestamp="2025-11-22 09:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:41.060310506 +0000 UTC m=+1135.996000155" watchObservedRunningTime="2025-11-22 09:32:41.061733788 +0000 UTC m=+1135.997423437" Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.075994 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1","Type":"ContainerStarted","Data":"ac974dba1e218351d3a65895634ca56193bae102d7afec608731d5b179654883"} Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.103526 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57dccfc6dd-fnk6j" podStartSLOduration=4.103500382 podStartE2EDuration="4.103500382s" podCreationTimestamp="2025-11-22 09:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:41.087112302 +0000 UTC m=+1136.022801941" watchObservedRunningTime="2025-11-22 09:32:41.103500382 +0000 UTC m=+1136.039190031" Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.104594 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerStarted","Data":"6061f65561a394ed9574c6f0411253104dc193c1b1bd52e82a200ac29c97d9fc"} Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.126501 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96c0f5ab-1d78-4937-bbc9-cafa758ebf56","Type":"ContainerStarted","Data":"361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6"} Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.173514 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-k5lzf"] Nov 22 09:32:41 crc kubenswrapper[4846]: I1122 09:32:41.192493 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-k5lzf"] Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.089920 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd727ed-a2cd-49c2-b38f-2f38ba88218c" path="/var/lib/kubelet/pods/2dd727ed-a2cd-49c2-b38f-2f38ba88218c/volumes" Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.137399 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1","Type":"ContainerStarted","Data":"0a015f218c4f1d3f77870e4e113e72750cc8269d360fecabce39638daff75cb4"} Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.137904 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" 
Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.138019 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerName="cinder-api-log" containerID="cri-o://ac974dba1e218351d3a65895634ca56193bae102d7afec608731d5b179654883" gracePeriod=30 Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.138113 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerName="cinder-api" containerID="cri-o://0a015f218c4f1d3f77870e4e113e72750cc8269d360fecabce39638daff75cb4" gracePeriod=30 Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.139509 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96c0f5ab-1d78-4937-bbc9-cafa758ebf56","Type":"ContainerStarted","Data":"ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69"} Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.177577 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.177553948 podStartE2EDuration="5.177553948s" podCreationTimestamp="2025-11-22 09:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:42.161420235 +0000 UTC m=+1137.097109884" watchObservedRunningTime="2025-11-22 09:32:42.177553948 +0000 UTC m=+1137.113243597" Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.199864 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.039213629 podStartE2EDuration="6.199841471s" podCreationTimestamp="2025-11-22 09:32:36 +0000 UTC" firstStartedPulling="2025-11-22 09:32:38.45431155 +0000 UTC m=+1133.390001199" lastFinishedPulling="2025-11-22 09:32:39.614939392 +0000 UTC m=+1134.550629041" observedRunningTime="2025-11-22 09:32:42.194758142 +0000 UTC m=+1137.130447791" watchObservedRunningTime="2025-11-22 09:32:42.199841471 +0000 UTC m=+1137.135531140" Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.281150 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.691776 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5dfd5ccb4b-fpl7v" Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.791152 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8564f79874-c88vw"] Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.795857 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8564f79874-c88vw" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon-log" containerID="cri-o://27001cb788644216b3d4184a41d7a2c4b77ab48f03fe634319d55ff847ebacea" gracePeriod=30 Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.796354 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8564f79874-c88vw" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" containerID="cri-o://5a55e9de128b8e237f5cd8fabafd175065c25d630198e7d28d8c6d6779e35778" gracePeriod=30 Nov 22 09:32:42 crc kubenswrapper[4846]: I1122 09:32:42.809347 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8564f79874-c88vw" 
podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Nov 22 09:32:43 crc kubenswrapper[4846]: I1122 09:32:43.160733 4846 generic.go:334] "Generic (PLEG): container finished" podID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerID="0a015f218c4f1d3f77870e4e113e72750cc8269d360fecabce39638daff75cb4" exitCode=0 Nov 22 09:32:43 crc kubenswrapper[4846]: I1122 09:32:43.160781 4846 generic.go:334] "Generic (PLEG): container finished" podID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerID="ac974dba1e218351d3a65895634ca56193bae102d7afec608731d5b179654883" exitCode=143 Nov 22 09:32:43 crc kubenswrapper[4846]: I1122 09:32:43.160784 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1","Type":"ContainerDied","Data":"0a015f218c4f1d3f77870e4e113e72750cc8269d360fecabce39638daff75cb4"} Nov 22 09:32:43 crc kubenswrapper[4846]: I1122 09:32:43.160837 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1","Type":"ContainerDied","Data":"ac974dba1e218351d3a65895634ca56193bae102d7afec608731d5b179654883"} Nov 22 09:32:43 crc kubenswrapper[4846]: I1122 09:32:43.992218 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.170807 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data-custom\") pod \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.170883 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-logs\") pod \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.170924 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-combined-ca-bundle\") pod \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.170962 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-scripts\") pod \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.171107 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-etc-machine-id\") pod \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.171251 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data\") pod \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " Nov 22 
09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.171271 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tmv4\" (UniqueName: \"kubernetes.io/projected/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-kube-api-access-4tmv4\") pod \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\" (UID: \"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1\") " Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.173496 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-logs" (OuterVolumeSpecName: "logs") pod "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" (UID: "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.177700 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" (UID: "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.182862 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-scripts" (OuterVolumeSpecName: "scripts") pod "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" (UID: "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.191719 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-kube-api-access-4tmv4" (OuterVolumeSpecName: "kube-api-access-4tmv4") pod "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" (UID: "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1"). InnerVolumeSpecName "kube-api-access-4tmv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.191840 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" (UID: "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.214377 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1","Type":"ContainerDied","Data":"134257b4a511c69f629f167844d3e63e1f788b11ad76ffe9c99879dc2c7b3759"} Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.214502 4846 scope.go:117] "RemoveContainer" containerID="0a015f218c4f1d3f77870e4e113e72750cc8269d360fecabce39638daff75cb4" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.214686 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.219176 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" (UID: "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.232562 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerStarted","Data":"7c64a3ee6d2066e9cfae66641815dcc2ef95419e93910d7c9a57c1b33f5d640e"} Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.249241 4846 scope.go:117] "RemoveContainer" containerID="ac974dba1e218351d3a65895634ca56193bae102d7afec608731d5b179654883" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.252252 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data" (OuterVolumeSpecName: "config-data") pod "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" (UID: "7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.275340 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.275374 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.275385 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.275396 4846 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.275405 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.275417 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tmv4\" (UniqueName: \"kubernetes.io/projected/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-kube-api-access-4tmv4\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.275427 4846 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.623317 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.650656 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.665835 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:32:44 crc kubenswrapper[4846]: E1122 09:32:44.666378 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerName="cinder-api-log" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 
09:32:44.666402 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerName="cinder-api-log" Nov 22 09:32:44 crc kubenswrapper[4846]: E1122 09:32:44.666423 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddba369c-0b60-4702-a6d9-7818a562f677" containerName="dnsmasq-dns" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.666433 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddba369c-0b60-4702-a6d9-7818a562f677" containerName="dnsmasq-dns" Nov 22 09:32:44 crc kubenswrapper[4846]: E1122 09:32:44.666452 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd727ed-a2cd-49c2-b38f-2f38ba88218c" containerName="init" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.666458 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd727ed-a2cd-49c2-b38f-2f38ba88218c" containerName="init" Nov 22 09:32:44 crc kubenswrapper[4846]: E1122 09:32:44.666474 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerName="cinder-api" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.666480 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerName="cinder-api" Nov 22 09:32:44 crc kubenswrapper[4846]: E1122 09:32:44.666513 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddba369c-0b60-4702-a6d9-7818a562f677" containerName="init" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.666522 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddba369c-0b60-4702-a6d9-7818a562f677" containerName="init" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.666744 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddba369c-0b60-4702-a6d9-7818a562f677" containerName="dnsmasq-dns" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.666759 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dd727ed-a2cd-49c2-b38f-2f38ba88218c" containerName="init" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.666770 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerName="cinder-api-log" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.666777 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" containerName="cinder-api" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.667876 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.681940 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.683309 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.683517 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.683576 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.784769 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.784937 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-config-data\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.785102 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.785253 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-logs\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.785285 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.785346 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-scripts\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.785368 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.785514 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.785671 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft22w\" (UniqueName: \"kubernetes.io/projected/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-kube-api-access-ft22w\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.887389 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.887479 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.887505 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-logs\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.887549 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-scripts\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.887567 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.888204 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-logs\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.888296 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.888827 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft22w\" (UniqueName: \"kubernetes.io/projected/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-kube-api-access-ft22w\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.888936 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.889018 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-config-data\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.888742 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.894494 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.895076 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-scripts\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.895684 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.914608 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-config-data\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.914694 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.917698 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.923139 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft22w\" (UniqueName: \"kubernetes.io/projected/2b86aa01-1c05-47da-9f91-ef71a5e6d7ec-kube-api-access-ft22w\") pod \"cinder-api-0\" (UID: \"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec\") " pod="openstack/cinder-api-0" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.970461 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f98fdfc57-v8bnv"] Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 
09:32:44.974074 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.978338 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.979174 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 22 09:32:44 crc kubenswrapper[4846]: I1122 09:32:44.997996 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f98fdfc57-v8bnv"] Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.005725 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.052703 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.099430 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-internal-tls-certs\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.099525 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-config\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.099545 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-public-tls-certs\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.099564 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-ovndb-tls-certs\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.099597 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-httpd-config\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.099637 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-combined-ca-bundle\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.099658 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqcsk\" 
(UniqueName: \"kubernetes.io/projected/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-kube-api-access-cqcsk\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.133711 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.201960 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-internal-tls-certs\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.202073 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-config\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.202099 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-public-tls-certs\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.202123 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-ovndb-tls-certs\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.202154 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-httpd-config\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.202202 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-combined-ca-bundle\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.202223 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqcsk\" (UniqueName: \"kubernetes.io/projected/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-kube-api-access-cqcsk\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.212929 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-config\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.214674 4846 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-ovndb-tls-certs\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.215605 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-public-tls-certs\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.216259 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-internal-tls-certs\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.220426 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-combined-ca-bundle\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.224416 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-httpd-config\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.227332 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqcsk\" (UniqueName: \"kubernetes.io/projected/eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4-kube-api-access-cqcsk\") pod \"neutron-7f98fdfc57-v8bnv\" (UID: \"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4\") " pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.283798 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerStarted","Data":"07bcbfdb85e5ee64deffec7247b7af6209005a4dc0f0740f924239ac5cbb73cd"} Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.417907 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:45 crc kubenswrapper[4846]: I1122 09:32:45.693626 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.202945 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1" path="/var/lib/kubelet/pods/7f51fe06-60dd-4c64-91f9-3ecbf14ca6c1/volumes" Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.284345 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8564f79874-c88vw" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:40998->10.217.0.147:8443: read: connection reset by peer" Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.285267 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8564f79874-c88vw" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.380404 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerStarted","Data":"bcc47003a42ed33cb3107ab0ddc358bc36e1a6b16345ab14def3ca462bb1d8e2"} Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.381860 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.390288 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f98fdfc57-v8bnv"] Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.402576 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec","Type":"ContainerStarted","Data":"36efaaee54672618a17c95d384dc4385897bd845e460eba624c007b5e1ccc3d7"} Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.430143 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.155107117 podStartE2EDuration="8.430119654s" podCreationTimestamp="2025-11-22 09:32:38 +0000 UTC" firstStartedPulling="2025-11-22 09:32:39.548346151 +0000 UTC m=+1134.484035800" lastFinishedPulling="2025-11-22 09:32:45.823358688 +0000 UTC m=+1140.759048337" observedRunningTime="2025-11-22 09:32:46.424674584 +0000 UTC m=+1141.360364233" watchObservedRunningTime="2025-11-22 09:32:46.430119654 +0000 UTC m=+1141.365809303" Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.438063 4846 generic.go:334] "Generic (PLEG): container finished" podID="f79042af-3413-4614-a787-72fdd7fc91d7" containerID="5a55e9de128b8e237f5cd8fabafd175065c25d630198e7d28d8c6d6779e35778" exitCode=0 Nov 22 09:32:46 crc kubenswrapper[4846]: I1122 09:32:46.438144 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8564f79874-c88vw" event={"ID":"f79042af-3413-4614-a787-72fdd7fc91d7","Type":"ContainerDied","Data":"5a55e9de128b8e237f5cd8fabafd175065c25d630198e7d28d8c6d6779e35778"} Nov 22 09:32:47 crc kubenswrapper[4846]: I1122 09:32:47.454041 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec","Type":"ContainerStarted","Data":"f98891e71e50f2a51c996e46384b59866faa30212431e5bc53b646006d8d7b5b"} Nov 22 09:32:47 crc kubenswrapper[4846]: I1122 09:32:47.460345 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f98fdfc57-v8bnv" event={"ID":"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4","Type":"ContainerStarted","Data":"5a700dd0f3c44d78b494c35db22e59a04e9daf5d7650930b3c1889d619f2326c"} Nov 22 09:32:47 crc kubenswrapper[4846]: I1122 09:32:47.460377 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f98fdfc57-v8bnv" event={"ID":"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4","Type":"ContainerStarted","Data":"7a5d0d4867cb8086882cdc82aced196c3e7a0a51f953e4528190025b1b51049b"} Nov 22 09:32:47 crc kubenswrapper[4846]: I1122 09:32:47.460389 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f98fdfc57-v8bnv" event={"ID":"eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4","Type":"ContainerStarted","Data":"d5f4b185ed1c1a1e14daea116f65921aa22b5129ee052c58a8cbadde04d2b023"} Nov 22 09:32:47 crc kubenswrapper[4846]: I1122 09:32:47.461260 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f98fdfc57-v8bnv" Nov 22 09:32:47 crc kubenswrapper[4846]: I1122 09:32:47.689204 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 09:32:47 crc kubenswrapper[4846]: I1122 09:32:47.729827 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f98fdfc57-v8bnv" podStartSLOduration=3.72980775 podStartE2EDuration="3.72980775s" podCreationTimestamp="2025-11-22 09:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:47.50451851 +0000 UTC m=+1142.440208169" watchObservedRunningTime="2025-11-22 09:32:47.72980775 +0000 UTC m=+1142.665497399" Nov 22 09:32:47 crc kubenswrapper[4846]: I1122 09:32:47.740745 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.109757 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.116082 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55fdfc87fd-75r6l" Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.193177 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5746494d7d-54slg"] Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.193530 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5746494d7d-54slg" podUID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerName="barbican-api-log" containerID="cri-o://f0849769cded9b27f9870a134ae92c617cd6d215232728c70c6c26c2301208c9" gracePeriod=30 Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.193655 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5746494d7d-54slg" podUID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerName="barbican-api" containerID="cri-o://25f356cf8d41310ff44d2ab3490dbe65c59b831533954846c04c2e5b1a6f2bcc" gracePeriod=30 Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.391229 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.474642 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-vz6qx"] Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.474937 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" podUID="d9550585-5ddb-45d1-9471-884d030282fb" containerName="dnsmasq-dns" containerID="cri-o://a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99" gracePeriod=10 Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.489732 4846 generic.go:334] "Generic (PLEG): container finished" podID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerID="f0849769cded9b27f9870a134ae92c617cd6d215232728c70c6c26c2301208c9" exitCode=143 Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.490270 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5746494d7d-54slg" event={"ID":"2ff9a258-1c0d-4003-8261-8664a42d0091","Type":"ContainerDied","Data":"f0849769cded9b27f9870a134ae92c617cd6d215232728c70c6c26c2301208c9"} Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.500901 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2b86aa01-1c05-47da-9f91-ef71a5e6d7ec","Type":"ContainerStarted","Data":"a30ba0c90675ed40f1f275f4745e961f20da5c58272672987dd8baa04ad54d7b"} Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.500929 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerName="cinder-scheduler" containerID="cri-o://361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6" gracePeriod=30 Nov 22 09:32:48 crc kubenswrapper[4846]: I1122 09:32:48.501097 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerName="probe" containerID="cri-o://ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69" gracePeriod=30 Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.082347 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.113811 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.113785616 podStartE2EDuration="5.113785616s" podCreationTimestamp="2025-11-22 09:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:48.54354249 +0000 UTC m=+1143.479232149" watchObservedRunningTime="2025-11-22 09:32:49.113785616 +0000 UTC m=+1144.049475265" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.199388 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-config\") pod \"d9550585-5ddb-45d1-9471-884d030282fb\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.199526 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-nb\") pod \"d9550585-5ddb-45d1-9471-884d030282fb\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.199632 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-sb\") pod \"d9550585-5ddb-45d1-9471-884d030282fb\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.199652 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-svc\") pod \"d9550585-5ddb-45d1-9471-884d030282fb\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.199796 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g5bc\" (UniqueName: \"kubernetes.io/projected/d9550585-5ddb-45d1-9471-884d030282fb-kube-api-access-8g5bc\") pod \"d9550585-5ddb-45d1-9471-884d030282fb\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.199840 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-swift-storage-0\") pod \"d9550585-5ddb-45d1-9471-884d030282fb\" (UID: \"d9550585-5ddb-45d1-9471-884d030282fb\") " Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.212304 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9550585-5ddb-45d1-9471-884d030282fb-kube-api-access-8g5bc" (OuterVolumeSpecName: "kube-api-access-8g5bc") pod "d9550585-5ddb-45d1-9471-884d030282fb" (UID: "d9550585-5ddb-45d1-9471-884d030282fb"). InnerVolumeSpecName "kube-api-access-8g5bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.264827 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d9550585-5ddb-45d1-9471-884d030282fb" (UID: "d9550585-5ddb-45d1-9471-884d030282fb"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.282978 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d9550585-5ddb-45d1-9471-884d030282fb" (UID: "d9550585-5ddb-45d1-9471-884d030282fb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.285974 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-config" (OuterVolumeSpecName: "config") pod "d9550585-5ddb-45d1-9471-884d030282fb" (UID: "d9550585-5ddb-45d1-9471-884d030282fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.286936 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9550585-5ddb-45d1-9471-884d030282fb" (UID: "d9550585-5ddb-45d1-9471-884d030282fb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.303406 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.303439 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.303452 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.303465 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.303474 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g5bc\" (UniqueName: \"kubernetes.io/projected/d9550585-5ddb-45d1-9471-884d030282fb-kube-api-access-8g5bc\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.311134 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d9550585-5ddb-45d1-9471-884d030282fb" (UID: "d9550585-5ddb-45d1-9471-884d030282fb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.405692 4846 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9550585-5ddb-45d1-9471-884d030282fb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.510710 4846 generic.go:334] "Generic (PLEG): container finished" podID="d9550585-5ddb-45d1-9471-884d030282fb" containerID="a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99" exitCode=0 Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.510773 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" event={"ID":"d9550585-5ddb-45d1-9471-884d030282fb","Type":"ContainerDied","Data":"a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99"} Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.510856 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" event={"ID":"d9550585-5ddb-45d1-9471-884d030282fb","Type":"ContainerDied","Data":"ac7974430f1484daca8518d50c9eef6743e840eb437a67b5e89a91dc7aa91c6c"} Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.510892 4846 scope.go:117] "RemoveContainer" containerID="a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.510819 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-vz6qx" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.513463 4846 generic.go:334] "Generic (PLEG): container finished" podID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerID="ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69" exitCode=0 Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.513832 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96c0f5ab-1d78-4937-bbc9-cafa758ebf56","Type":"ContainerDied","Data":"ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69"} Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.513979 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.544169 4846 scope.go:117] "RemoveContainer" containerID="83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.559827 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-vz6qx"] Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.568453 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-vz6qx"] Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.570410 4846 scope.go:117] "RemoveContainer" containerID="a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99" Nov 22 09:32:49 crc kubenswrapper[4846]: E1122 09:32:49.570949 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99\": container with ID starting with a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99 not found: ID does not exist" containerID="a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.570986 4846 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99"} err="failed to get container status \"a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99\": rpc error: code = NotFound desc = could not find container \"a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99\": container with ID starting with a1d538aed00228549fc4c5fe20809bcbc29452f2a16d03c591f9b5c709030b99 not found: ID does not exist" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.571013 4846 scope.go:117] "RemoveContainer" containerID="83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6" Nov 22 09:32:49 crc kubenswrapper[4846]: E1122 09:32:49.571551 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6\": container with ID starting with 83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6 not found: ID does not exist" containerID="83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6" Nov 22 09:32:49 crc kubenswrapper[4846]: I1122 09:32:49.571626 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6"} err="failed to get container status \"83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6\": rpc error: code = NotFound desc = could not find container \"83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6\": container with ID starting with 83e13f3da84724ac5111f97fb0f618c53b90a28226fee357bca7acba01e866f6 not found: ID does not exist" Nov 22 09:32:50 crc kubenswrapper[4846]: I1122 09:32:50.046221 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9550585-5ddb-45d1-9471-884d030282fb" path="/var/lib/kubelet/pods/d9550585-5ddb-45d1-9471-884d030282fb/volumes" Nov 22 09:32:51 crc kubenswrapper[4846]: I1122 09:32:51.558805 4846 generic.go:334] "Generic (PLEG): container finished" podID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerID="25f356cf8d41310ff44d2ab3490dbe65c59b831533954846c04c2e5b1a6f2bcc" exitCode=0 Nov 22 09:32:51 crc kubenswrapper[4846]: I1122 09:32:51.558858 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5746494d7d-54slg" event={"ID":"2ff9a258-1c0d-4003-8261-8664a42d0091","Type":"ContainerDied","Data":"25f356cf8d41310ff44d2ab3490dbe65c59b831533954846c04c2e5b1a6f2bcc"} Nov 22 09:32:51 crc kubenswrapper[4846]: I1122 09:32:51.958332 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.103800 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff9a258-1c0d-4003-8261-8664a42d0091-logs\") pod \"2ff9a258-1c0d-4003-8261-8664a42d0091\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.103980 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-combined-ca-bundle\") pod \"2ff9a258-1c0d-4003-8261-8664a42d0091\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.104199 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgfc4\" (UniqueName: \"kubernetes.io/projected/2ff9a258-1c0d-4003-8261-8664a42d0091-kube-api-access-fgfc4\") pod \"2ff9a258-1c0d-4003-8261-8664a42d0091\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.104245 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data\") pod \"2ff9a258-1c0d-4003-8261-8664a42d0091\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.104346 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data-custom\") pod \"2ff9a258-1c0d-4003-8261-8664a42d0091\" (UID: \"2ff9a258-1c0d-4003-8261-8664a42d0091\") " Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.104885 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff9a258-1c0d-4003-8261-8664a42d0091-logs" (OuterVolumeSpecName: "logs") pod "2ff9a258-1c0d-4003-8261-8664a42d0091" (UID: "2ff9a258-1c0d-4003-8261-8664a42d0091"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.109670 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff9a258-1c0d-4003-8261-8664a42d0091-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.115654 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2ff9a258-1c0d-4003-8261-8664a42d0091" (UID: "2ff9a258-1c0d-4003-8261-8664a42d0091"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.126732 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff9a258-1c0d-4003-8261-8664a42d0091-kube-api-access-fgfc4" (OuterVolumeSpecName: "kube-api-access-fgfc4") pod "2ff9a258-1c0d-4003-8261-8664a42d0091" (UID: "2ff9a258-1c0d-4003-8261-8664a42d0091"). InnerVolumeSpecName "kube-api-access-fgfc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.138382 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff9a258-1c0d-4003-8261-8664a42d0091" (UID: "2ff9a258-1c0d-4003-8261-8664a42d0091"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.161006 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data" (OuterVolumeSpecName: "config-data") pod "2ff9a258-1c0d-4003-8261-8664a42d0091" (UID: "2ff9a258-1c0d-4003-8261-8664a42d0091"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.212370 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgfc4\" (UniqueName: \"kubernetes.io/projected/2ff9a258-1c0d-4003-8261-8664a42d0091-kube-api-access-fgfc4\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.213062 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.213147 4846 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.213194 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff9a258-1c0d-4003-8261-8664a42d0091-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.579097 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5746494d7d-54slg" event={"ID":"2ff9a258-1c0d-4003-8261-8664a42d0091","Type":"ContainerDied","Data":"f1f6277a3d70ea1b09ad2fcf689b4c493a294e0b8306771b062fb4687713589d"} Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.579167 4846 scope.go:117] "RemoveContainer" containerID="25f356cf8d41310ff44d2ab3490dbe65c59b831533954846c04c2e5b1a6f2bcc" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.579264 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5746494d7d-54slg" Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.619901 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5746494d7d-54slg"] Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.631230 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5746494d7d-54slg"] Nov 22 09:32:52 crc kubenswrapper[4846]: I1122 09:32:52.634192 4846 scope.go:117] "RemoveContainer" containerID="f0849769cded9b27f9870a134ae92c617cd6d215232728c70c6c26c2301208c9" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.546849 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.589896 4846 generic.go:334] "Generic (PLEG): container finished" podID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerID="361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6" exitCode=0 Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.589969 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96c0f5ab-1d78-4937-bbc9-cafa758ebf56","Type":"ContainerDied","Data":"361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6"} Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.590016 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96c0f5ab-1d78-4937-bbc9-cafa758ebf56","Type":"ContainerDied","Data":"585766e7a6d24707753e17b143fc52ea7bd18bdccb5d49fe68576da21db0eebd"} Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.590071 4846 scope.go:117] "RemoveContainer" containerID="ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.590193 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.615643 4846 scope.go:117] "RemoveContainer" containerID="361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.645800 4846 scope.go:117] "RemoveContainer" containerID="ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69" Nov 22 09:32:53 crc kubenswrapper[4846]: E1122 09:32:53.646559 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69\": container with ID starting with ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69 not found: ID does not exist" containerID="ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.646603 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69"} err="failed to get container status \"ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69\": rpc error: code = NotFound desc = could not find container \"ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69\": container with ID starting with ad74443a62433cff125a7468d8def26806d02c5f222031c2c35c545027a1ca69 not found: ID does not exist" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.646638 4846 scope.go:117] "RemoveContainer" containerID="361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6" Nov 22 09:32:53 crc kubenswrapper[4846]: E1122 09:32:53.647257 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6\": container with ID starting with 361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6 not found: ID does not exist" containerID="361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.647286 4846 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6"} err="failed to get container status \"361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6\": rpc error: code = NotFound desc = could not find container \"361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6\": container with ID starting with 361604dc93129ab90777ae643599ed2228c9c88b2e4d2709c69feaa2a0201bb6 not found: ID does not exist" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.745860 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lfjs\" (UniqueName: \"kubernetes.io/projected/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-kube-api-access-6lfjs\") pod \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.745939 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-combined-ca-bundle\") pod \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.746038 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data-custom\") pod \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.746801 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data\") pod \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.746828 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-scripts\") pod \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.746871 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-etc-machine-id\") pod \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\" (UID: \"96c0f5ab-1d78-4937-bbc9-cafa758ebf56\") " Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.747038 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "96c0f5ab-1d78-4937-bbc9-cafa758ebf56" (UID: "96c0f5ab-1d78-4937-bbc9-cafa758ebf56"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.747931 4846 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.754845 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "96c0f5ab-1d78-4937-bbc9-cafa758ebf56" (UID: "96c0f5ab-1d78-4937-bbc9-cafa758ebf56"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.754860 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-scripts" (OuterVolumeSpecName: "scripts") pod "96c0f5ab-1d78-4937-bbc9-cafa758ebf56" (UID: "96c0f5ab-1d78-4937-bbc9-cafa758ebf56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.755169 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-kube-api-access-6lfjs" (OuterVolumeSpecName: "kube-api-access-6lfjs") pod "96c0f5ab-1d78-4937-bbc9-cafa758ebf56" (UID: "96c0f5ab-1d78-4937-bbc9-cafa758ebf56"). InnerVolumeSpecName "kube-api-access-6lfjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.830468 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96c0f5ab-1d78-4937-bbc9-cafa758ebf56" (UID: "96c0f5ab-1d78-4937-bbc9-cafa758ebf56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.850295 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lfjs\" (UniqueName: \"kubernetes.io/projected/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-kube-api-access-6lfjs\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.850333 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.850343 4846 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.850353 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.862935 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data" (OuterVolumeSpecName: "config-data") pod "96c0f5ab-1d78-4937-bbc9-cafa758ebf56" (UID: "96c0f5ab-1d78-4937-bbc9-cafa758ebf56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.943671 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.950836 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.952094 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c0f5ab-1d78-4937-bbc9-cafa758ebf56-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.974415 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:32:53 crc kubenswrapper[4846]: E1122 09:32:53.974883 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerName="barbican-api-log" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.974906 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerName="barbican-api-log" Nov 22 09:32:53 crc kubenswrapper[4846]: E1122 09:32:53.974922 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerName="cinder-scheduler" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.974930 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerName="cinder-scheduler" Nov 22 09:32:53 crc kubenswrapper[4846]: E1122 09:32:53.974957 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9550585-5ddb-45d1-9471-884d030282fb" containerName="init" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.974964 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9550585-5ddb-45d1-9471-884d030282fb" containerName="init" Nov 22 09:32:53 crc kubenswrapper[4846]: E1122 09:32:53.974977 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerName="probe" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.974983 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerName="probe" Nov 22 09:32:53 crc kubenswrapper[4846]: E1122 09:32:53.975012 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerName="barbican-api" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.975032 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerName="barbican-api" Nov 22 09:32:53 crc kubenswrapper[4846]: E1122 09:32:53.975059 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9550585-5ddb-45d1-9471-884d030282fb" containerName="dnsmasq-dns" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.975065 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9550585-5ddb-45d1-9471-884d030282fb" containerName="dnsmasq-dns" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.975238 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerName="cinder-scheduler" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.975252 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9550585-5ddb-45d1-9471-884d030282fb" containerName="dnsmasq-dns" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 
09:32:53.975261 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" containerName="probe" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.975267 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerName="barbican-api-log" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.975279 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff9a258-1c0d-4003-8261-8664a42d0091" containerName="barbican-api" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.976418 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.978711 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 09:32:53 crc kubenswrapper[4846]: I1122 09:32:53.984574 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.048228 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff9a258-1c0d-4003-8261-8664a42d0091" path="/var/lib/kubelet/pods/2ff9a258-1c0d-4003-8261-8664a42d0091/volumes" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.049312 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c0f5ab-1d78-4937-bbc9-cafa758ebf56" path="/var/lib/kubelet/pods/96c0f5ab-1d78-4937-bbc9-cafa758ebf56/volumes" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.054662 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.054712 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-config-data\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.054807 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-scripts\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.054832 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.054874 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.054911 4846 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5wpr\" (UniqueName: \"kubernetes.io/projected/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-kube-api-access-g5wpr\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.157154 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.157233 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5wpr\" (UniqueName: \"kubernetes.io/projected/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-kube-api-access-g5wpr\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.157290 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.157325 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-config-data\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.157398 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-scripts\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.157422 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.157513 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.169901 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.170112 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-combined-ca-bundle\") pod \"cinder-scheduler-0\" 
(UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.172331 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-scripts\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.173391 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-config-data\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.185805 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5wpr\" (UniqueName: \"kubernetes.io/projected/ea0ad07d-59fe-4c26-b1a7-69b9181631d8-kube-api-access-g5wpr\") pod \"cinder-scheduler-0\" (UID: \"ea0ad07d-59fe-4c26-b1a7-69b9181631d8\") " pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.293452 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:32:54 crc kubenswrapper[4846]: I1122 09:32:54.834408 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:32:55 crc kubenswrapper[4846]: I1122 09:32:55.645560 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ea0ad07d-59fe-4c26-b1a7-69b9181631d8","Type":"ContainerStarted","Data":"15fd9424bb0f8cb3dc2dd2377b7b93c084b3f8a9ef3f4be6ef951b344ae5add1"} Nov 22 09:32:55 crc kubenswrapper[4846]: I1122 09:32:55.646382 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ea0ad07d-59fe-4c26-b1a7-69b9181631d8","Type":"ContainerStarted","Data":"394f6e9e878dfb9d6cbe806e47629266a5ad05b5f7afc012cd3457ac6cb96492"} Nov 22 09:32:55 crc kubenswrapper[4846]: I1122 09:32:55.899313 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8564f79874-c88vw" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 22 09:32:56 crc kubenswrapper[4846]: I1122 09:32:56.662646 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ea0ad07d-59fe-4c26-b1a7-69b9181631d8","Type":"ContainerStarted","Data":"b73d9efe3d90f2ae11e4f16ae45aaf75f0618054f374f59979f2fabe994acf2a"} Nov 22 09:32:56 crc kubenswrapper[4846]: I1122 09:32:56.691942 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.69191454 podStartE2EDuration="3.69191454s" podCreationTimestamp="2025-11-22 09:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:32:56.683557076 +0000 UTC m=+1151.619246745" watchObservedRunningTime="2025-11-22 09:32:56.69191454 +0000 UTC m=+1151.627604189" Nov 22 09:32:56 crc kubenswrapper[4846]: I1122 09:32:56.883648 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5ccd94b5cf-fd5rp" Nov 22 09:32:57 crc 
kubenswrapper[4846]: I1122 09:32:57.160798 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:57 crc kubenswrapper[4846]: I1122 09:32:57.177915 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-569956d6b4-jtk8r" Nov 22 09:32:57 crc kubenswrapper[4846]: I1122 09:32:57.178420 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 22 09:32:58 crc kubenswrapper[4846]: I1122 09:32:58.626253 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:32:58 crc kubenswrapper[4846]: I1122 09:32:58.626341 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:32:59 crc kubenswrapper[4846]: I1122 09:32:59.294242 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.127501 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.129020 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.131326 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.131722 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.131866 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-tg7jz" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.140094 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.275255 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37cf7b-6c4e-44c5-8193-38a0888efeee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.275744 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e37cf7b-6c4e-44c5-8193-38a0888efeee-openstack-config\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.276118 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0e37cf7b-6c4e-44c5-8193-38a0888efeee-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.276289 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v87k\" (UniqueName: \"kubernetes.io/projected/0e37cf7b-6c4e-44c5-8193-38a0888efeee-kube-api-access-9v87k\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.379073 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e37cf7b-6c4e-44c5-8193-38a0888efeee-openstack-config\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.379658 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0e37cf7b-6c4e-44c5-8193-38a0888efeee-openstack-config-secret\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.379788 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v87k\" (UniqueName: \"kubernetes.io/projected/0e37cf7b-6c4e-44c5-8193-38a0888efeee-kube-api-access-9v87k\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.379914 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37cf7b-6c4e-44c5-8193-38a0888efeee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.380589 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0e37cf7b-6c4e-44c5-8193-38a0888efeee-openstack-config\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.388620 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0e37cf7b-6c4e-44c5-8193-38a0888efeee-openstack-config-secret\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.396131 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e37cf7b-6c4e-44c5-8193-38a0888efeee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.401209 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v87k\" (UniqueName: \"kubernetes.io/projected/0e37cf7b-6c4e-44c5-8193-38a0888efeee-kube-api-access-9v87k\") pod \"openstackclient\" (UID: \"0e37cf7b-6c4e-44c5-8193-38a0888efeee\") " pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.465763 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 09:33:00 crc kubenswrapper[4846]: I1122 09:33:00.956775 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 09:33:01 crc kubenswrapper[4846]: I1122 09:33:01.717843 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0e37cf7b-6c4e-44c5-8193-38a0888efeee","Type":"ContainerStarted","Data":"18a4ad00ae640be758f5f8f8e3cbc89a56b93e9da585211d6fc415517d197f44"} Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.112569 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-86d575f679-k6l72"] Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.116307 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.119010 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.123832 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.123854 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.126648 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-86d575f679-k6l72"] Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.180930 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgtm6\" (UniqueName: \"kubernetes.io/projected/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-kube-api-access-zgtm6\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.181000 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-combined-ca-bundle\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.181129 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-log-httpd\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.181157 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-run-httpd\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.181196 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-etc-swift\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 
22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.181221 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-public-tls-certs\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.181259 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-internal-tls-certs\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.181291 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-config-data\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.283225 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-etc-swift\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.283286 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-public-tls-certs\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.283332 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-internal-tls-certs\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.283368 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-config-data\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.283403 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgtm6\" (UniqueName: \"kubernetes.io/projected/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-kube-api-access-zgtm6\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.283437 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-combined-ca-bundle\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " 
pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.283478 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-log-httpd\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.283500 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-run-httpd\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.284201 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-run-httpd\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.285167 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-log-httpd\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.291106 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-combined-ca-bundle\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.291750 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-public-tls-certs\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.292060 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-internal-tls-certs\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.292640 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-etc-swift\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.301239 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-config-data\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.303782 4846 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zgtm6\" (UniqueName: \"kubernetes.io/projected/52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2-kube-api-access-zgtm6\") pod \"swift-proxy-86d575f679-k6l72\" (UID: \"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2\") " pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.449941 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:04 crc kubenswrapper[4846]: I1122 09:33:04.585460 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.001112 4846 scope.go:117] "RemoveContainer" containerID="544d578048e4f1eb9148ccb4174b0e98f61dcd8d4564d0e95b6f12613d4ac887" Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.064252 4846 scope.go:117] "RemoveContainer" containerID="149e21900294b3e3b026d44c7abe53723a541f3362b1612331e73a182737b796" Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.067722 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-86d575f679-k6l72"] Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.173135 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.173596 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="ceilometer-central-agent" containerID="cri-o://6061f65561a394ed9574c6f0411253104dc193c1b1bd52e82a200ac29c97d9fc" gracePeriod=30 Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.173908 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="proxy-httpd" containerID="cri-o://bcc47003a42ed33cb3107ab0ddc358bc36e1a6b16345ab14def3ca462bb1d8e2" gracePeriod=30 Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.173978 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="sg-core" containerID="cri-o://07bcbfdb85e5ee64deffec7247b7af6209005a4dc0f0740f924239ac5cbb73cd" gracePeriod=30 Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.174076 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="ceilometer-notification-agent" containerID="cri-o://7c64a3ee6d2066e9cfae66641815dcc2ef95419e93910d7c9a57c1b33f5d640e" gracePeriod=30 Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.182901 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.247399 4846 scope.go:117] "RemoveContainer" containerID="e72b9dd985087c95448080b5d6c149c46ad6f0d2cfe49d92125df692e33df836" Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.773657 4846 generic.go:334] "Generic (PLEG): container finished" podID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerID="bcc47003a42ed33cb3107ab0ddc358bc36e1a6b16345ab14def3ca462bb1d8e2" exitCode=0 Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.774169 4846 generic.go:334] "Generic (PLEG): container finished" podID="350dff5e-52cf-4530-9527-46f8c8dc3487" 
containerID="07bcbfdb85e5ee64deffec7247b7af6209005a4dc0f0740f924239ac5cbb73cd" exitCode=2 Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.774184 4846 generic.go:334] "Generic (PLEG): container finished" podID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerID="6061f65561a394ed9574c6f0411253104dc193c1b1bd52e82a200ac29c97d9fc" exitCode=0 Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.773870 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerDied","Data":"bcc47003a42ed33cb3107ab0ddc358bc36e1a6b16345ab14def3ca462bb1d8e2"} Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.774353 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerDied","Data":"07bcbfdb85e5ee64deffec7247b7af6209005a4dc0f0740f924239ac5cbb73cd"} Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.774397 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerDied","Data":"6061f65561a394ed9574c6f0411253104dc193c1b1bd52e82a200ac29c97d9fc"} Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.776880 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-86d575f679-k6l72" event={"ID":"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2","Type":"ContainerStarted","Data":"8558b03d3ccdd1f5cbc53beab6581bb09b69f251ad1dc4aab6391e7c3e17b661"} Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.776936 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-86d575f679-k6l72" event={"ID":"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2","Type":"ContainerStarted","Data":"7213d7235c344980990c3c0ab7c2f5a79ad945341330b6daee645c8ec58a454f"} Nov 22 09:33:05 crc kubenswrapper[4846]: I1122 09:33:05.898584 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8564f79874-c88vw" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 22 09:33:07 crc kubenswrapper[4846]: I1122 09:33:07.808149 4846 generic.go:334] "Generic (PLEG): container finished" podID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerID="7c64a3ee6d2066e9cfae66641815dcc2ef95419e93910d7c9a57c1b33f5d640e" exitCode=0 Nov 22 09:33:07 crc kubenswrapper[4846]: I1122 09:33:07.808215 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerDied","Data":"7c64a3ee6d2066e9cfae66641815dcc2ef95419e93910d7c9a57c1b33f5d640e"} Nov 22 09:33:08 crc kubenswrapper[4846]: I1122 09:33:08.108841 4846 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod584aeb0f-b1a9-4a6e-b129-b21593065b18"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod584aeb0f-b1a9-4a6e-b129-b21593065b18] : Timed out while waiting for systemd to remove kubepods-besteffort-pod584aeb0f_b1a9_4a6e_b129_b21593065b18.slice" Nov 22 09:33:08 crc kubenswrapper[4846]: E1122 09:33:08.108938 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod584aeb0f-b1a9-4a6e-b129-b21593065b18] : unable to destroy cgroup paths for cgroup [kubepods besteffort 
pod584aeb0f-b1a9-4a6e-b129-b21593065b18] : Timed out while waiting for systemd to remove kubepods-besteffort-pod584aeb0f_b1a9_4a6e_b129_b21593065b18.slice" pod="openstack/neutron-db-sync-9m5n9" podUID="584aeb0f-b1a9-4a6e-b129-b21593065b18" Nov 22 09:33:08 crc kubenswrapper[4846]: I1122 09:33:08.619388 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57dccfc6dd-fnk6j" Nov 22 09:33:08 crc kubenswrapper[4846]: I1122 09:33:08.718745 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": dial tcp 10.217.0.164:3000: connect: connection refused" Nov 22 09:33:08 crc kubenswrapper[4846]: I1122 09:33:08.837535 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9m5n9" Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.799020 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.872196 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350dff5e-52cf-4530-9527-46f8c8dc3487","Type":"ContainerDied","Data":"44ff705219db0167bcbd9e00d2956036afc459c3acb50e85aa5c19115a0411b7"} Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.872266 4846 scope.go:117] "RemoveContainer" containerID="bcc47003a42ed33cb3107ab0ddc358bc36e1a6b16345ab14def3ca462bb1d8e2" Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.872454 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.878213 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0e37cf7b-6c4e-44c5-8193-38a0888efeee","Type":"ContainerStarted","Data":"0360d3c7109a93ed94e2cb9a78063fddbb8b060b3003194b41fa7c42c4bbae48"} Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.884254 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-86d575f679-k6l72" event={"ID":"52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2","Type":"ContainerStarted","Data":"0f22ca3f3ff3d3687de0f43b12e76023ea71b25ac61ee3b8836209b94fa9bccb"} Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.884997 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-86d575f679-k6l72" Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.901820 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.348097871 podStartE2EDuration="11.90179694s" podCreationTimestamp="2025-11-22 09:33:00 +0000 UTC" firstStartedPulling="2025-11-22 09:33:00.979686928 +0000 UTC m=+1155.915376577" lastFinishedPulling="2025-11-22 09:33:11.533385997 +0000 UTC m=+1166.469075646" observedRunningTime="2025-11-22 09:33:11.892748395 +0000 UTC m=+1166.828438044" watchObservedRunningTime="2025-11-22 09:33:11.90179694 +0000 UTC m=+1166.837486589" Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.905858 4846 scope.go:117] "RemoveContainer" containerID="07bcbfdb85e5ee64deffec7247b7af6209005a4dc0f0740f924239ac5cbb73cd" Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.924768 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-86d575f679-k6l72" 
podStartSLOduration=7.924737592 podStartE2EDuration="7.924737592s" podCreationTimestamp="2025-11-22 09:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:33:11.920402495 +0000 UTC m=+1166.856092144" watchObservedRunningTime="2025-11-22 09:33:11.924737592 +0000 UTC m=+1166.860427241"
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.940917 4846 scope.go:117] "RemoveContainer" containerID="7c64a3ee6d2066e9cfae66641815dcc2ef95419e93910d7c9a57c1b33f5d640e"
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.969483 4846 scope.go:117] "RemoveContainer" containerID="6061f65561a394ed9574c6f0411253104dc193c1b1bd52e82a200ac29c97d9fc"
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.992894 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-scripts\") pod \"350dff5e-52cf-4530-9527-46f8c8dc3487\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") "
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.993175 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-sg-core-conf-yaml\") pod \"350dff5e-52cf-4530-9527-46f8c8dc3487\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") "
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.993221 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-config-data\") pod \"350dff5e-52cf-4530-9527-46f8c8dc3487\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") "
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.993289 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2ftr\" (UniqueName: \"kubernetes.io/projected/350dff5e-52cf-4530-9527-46f8c8dc3487-kube-api-access-x2ftr\") pod \"350dff5e-52cf-4530-9527-46f8c8dc3487\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") "
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.993375 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-run-httpd\") pod \"350dff5e-52cf-4530-9527-46f8c8dc3487\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") "
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.993432 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-log-httpd\") pod \"350dff5e-52cf-4530-9527-46f8c8dc3487\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") "
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.993472 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-combined-ca-bundle\") pod \"350dff5e-52cf-4530-9527-46f8c8dc3487\" (UID: \"350dff5e-52cf-4530-9527-46f8c8dc3487\") "
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.994478 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "350dff5e-52cf-4530-9527-46f8c8dc3487" (UID: "350dff5e-52cf-4530-9527-46f8c8dc3487"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.994803 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "350dff5e-52cf-4530-9527-46f8c8dc3487" (UID: "350dff5e-52cf-4530-9527-46f8c8dc3487"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:33:11 crc kubenswrapper[4846]: I1122 09:33:11.999409 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-scripts" (OuterVolumeSpecName: "scripts") pod "350dff5e-52cf-4530-9527-46f8c8dc3487" (UID: "350dff5e-52cf-4530-9527-46f8c8dc3487"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.000439 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/350dff5e-52cf-4530-9527-46f8c8dc3487-kube-api-access-x2ftr" (OuterVolumeSpecName: "kube-api-access-x2ftr") pod "350dff5e-52cf-4530-9527-46f8c8dc3487" (UID: "350dff5e-52cf-4530-9527-46f8c8dc3487"). InnerVolumeSpecName "kube-api-access-x2ftr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.039941 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "350dff5e-52cf-4530-9527-46f8c8dc3487" (UID: "350dff5e-52cf-4530-9527-46f8c8dc3487"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.090378 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "350dff5e-52cf-4530-9527-46f8c8dc3487" (UID: "350dff5e-52cf-4530-9527-46f8c8dc3487"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.095673 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.095706 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.095715 4846 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.095725 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2ftr\" (UniqueName: \"kubernetes.io/projected/350dff5e-52cf-4530-9527-46f8c8dc3487-kube-api-access-x2ftr\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.095737 4846 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.095745 4846 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350dff5e-52cf-4530-9527-46f8c8dc3487-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.108148 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-config-data" (OuterVolumeSpecName: "config-data") pod "350dff5e-52cf-4530-9527-46f8c8dc3487" (UID: "350dff5e-52cf-4530-9527-46f8c8dc3487"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.198749 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350dff5e-52cf-4530-9527-46f8c8dc3487-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.206942 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.217123 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.286001 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:33:12 crc kubenswrapper[4846]: E1122 09:33:12.287560 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="proxy-httpd"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.287584 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="proxy-httpd"
Nov 22 09:33:12 crc kubenswrapper[4846]: E1122 09:33:12.287747 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="ceilometer-notification-agent"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.287771 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="ceilometer-notification-agent"
Nov 22 09:33:12 crc kubenswrapper[4846]: E1122 09:33:12.288002 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="ceilometer-central-agent"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.288018 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="ceilometer-central-agent"
Nov 22 09:33:12 crc kubenswrapper[4846]: E1122 09:33:12.288505 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="sg-core"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.288573 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="sg-core"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.289163 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="sg-core"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.289197 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="proxy-httpd"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.289235 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="ceilometer-central-agent"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.289255 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" containerName="ceilometer-notification-agent"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.311714 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.313716 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.326422 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.326683 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.511424 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.511490 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.511508 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-run-httpd\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.511556 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnsmm\" (UniqueName: \"kubernetes.io/projected/dd8c33e0-fb6f-480a-9916-48c51af17009-kube-api-access-xnsmm\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.511571 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-scripts\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.511604 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-config-data\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.511718 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-log-httpd\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.615100 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-log-httpd\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.615199 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.615270 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.615308 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-run-httpd\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.615404 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnsmm\" (UniqueName: \"kubernetes.io/projected/dd8c33e0-fb6f-480a-9916-48c51af17009-kube-api-access-xnsmm\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.615426 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-scripts\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.615891 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-config-data\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.615981 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-log-httpd\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.616401 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-run-httpd\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.621924 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.624156 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-scripts\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.624259 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.625187 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-config-data\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.655841 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnsmm\" (UniqueName: \"kubernetes.io/projected/dd8c33e0-fb6f-480a-9916-48c51af17009-kube-api-access-xnsmm\") pod \"ceilometer-0\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") " pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: W1122 09:33:12.860520 4846 container.go:586] Failed to update stats for container "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf79042af_3413_4614_a787_72fdd7fc91d7.slice/crio-5c97cfbad8cd67e3b029562471229f3dbac1430883e1b3e233c33bc3ee6d9ab9": error while statting cgroup v2: [unable to parse /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf79042af_3413_4614_a787_72fdd7fc91d7.slice/crio-5c97cfbad8cd67e3b029562471229f3dbac1430883e1b3e233c33bc3ee6d9ab9/memory.stat: read /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf79042af_3413_4614_a787_72fdd7fc91d7.slice/crio-5c97cfbad8cd67e3b029562471229f3dbac1430883e1b3e233c33bc3ee6d9ab9/memory.stat: no such device], continuing to push stats
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.936905 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.940352 4846 generic.go:334] "Generic (PLEG): container finished" podID="f79042af-3413-4614-a787-72fdd7fc91d7" containerID="27001cb788644216b3d4184a41d7a2c4b77ab48f03fe634319d55ff847ebacea" exitCode=137
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.940515 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8564f79874-c88vw" event={"ID":"f79042af-3413-4614-a787-72fdd7fc91d7","Type":"ContainerDied","Data":"27001cb788644216b3d4184a41d7a2c4b77ab48f03fe634319d55ff847ebacea"}
Nov 22 09:33:12 crc kubenswrapper[4846]: I1122 09:33:12.960447 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-86d575f679-k6l72"
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.067868 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-86d575f679-k6l72"
Nov 22 09:33:13 crc kubenswrapper[4846]: E1122 09:33:13.299854 4846 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf79042af_3413_4614_a787_72fdd7fc91d7.slice/crio-5c97cfbad8cd67e3b029562471229f3dbac1430883e1b3e233c33bc3ee6d9ab9\": RecentStats: unable to find data in memory cache]"
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.519294 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8564f79874-c88vw"
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.638443 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-scripts\") pod \"f79042af-3413-4614-a787-72fdd7fc91d7\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") "
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.638603 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-combined-ca-bundle\") pod \"f79042af-3413-4614-a787-72fdd7fc91d7\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") "
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.638630 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-secret-key\") pod \"f79042af-3413-4614-a787-72fdd7fc91d7\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") "
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.638767 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f79042af-3413-4614-a787-72fdd7fc91d7-logs\") pod \"f79042af-3413-4614-a787-72fdd7fc91d7\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") "
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.638805 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-tls-certs\") pod \"f79042af-3413-4614-a787-72fdd7fc91d7\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") "
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.639012 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-config-data\") pod \"f79042af-3413-4614-a787-72fdd7fc91d7\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") "
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.639126 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scfhz\" (UniqueName: \"kubernetes.io/projected/f79042af-3413-4614-a787-72fdd7fc91d7-kube-api-access-scfhz\") pod \"f79042af-3413-4614-a787-72fdd7fc91d7\" (UID: \"f79042af-3413-4614-a787-72fdd7fc91d7\") "
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.640544 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79042af-3413-4614-a787-72fdd7fc91d7-logs" (OuterVolumeSpecName: "logs") pod "f79042af-3413-4614-a787-72fdd7fc91d7" (UID: "f79042af-3413-4614-a787-72fdd7fc91d7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.648390 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79042af-3413-4614-a787-72fdd7fc91d7-kube-api-access-scfhz" (OuterVolumeSpecName: "kube-api-access-scfhz") pod "f79042af-3413-4614-a787-72fdd7fc91d7" (UID: "f79042af-3413-4614-a787-72fdd7fc91d7"). InnerVolumeSpecName "kube-api-access-scfhz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.667733 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f79042af-3413-4614-a787-72fdd7fc91d7" (UID: "f79042af-3413-4614-a787-72fdd7fc91d7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.669935 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-config-data" (OuterVolumeSpecName: "config-data") pod "f79042af-3413-4614-a787-72fdd7fc91d7" (UID: "f79042af-3413-4614-a787-72fdd7fc91d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.707779 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.708319 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f79042af-3413-4614-a787-72fdd7fc91d7" (UID: "f79042af-3413-4614-a787-72fdd7fc91d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:13 crc kubenswrapper[4846]: W1122 09:33:13.712009 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd8c33e0_fb6f_480a_9916_48c51af17009.slice/crio-43ca787f9992fd66e1df2dc99bef3a15717ced4d5aaca9226d27907e3f65a441 WatchSource:0}: Error finding container 43ca787f9992fd66e1df2dc99bef3a15717ced4d5aaca9226d27907e3f65a441: Status 404 returned error can't find the container with id 43ca787f9992fd66e1df2dc99bef3a15717ced4d5aaca9226d27907e3f65a441
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.715729 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f79042af-3413-4614-a787-72fdd7fc91d7" (UID: "f79042af-3413-4614-a787-72fdd7fc91d7"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.741507 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.741544 4846 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.741553 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f79042af-3413-4614-a787-72fdd7fc91d7-logs\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.741564 4846 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f79042af-3413-4614-a787-72fdd7fc91d7-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.741573 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.741586 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scfhz\" (UniqueName: \"kubernetes.io/projected/f79042af-3413-4614-a787-72fdd7fc91d7-kube-api-access-scfhz\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.745212 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-scripts" (OuterVolumeSpecName: "scripts") pod "f79042af-3413-4614-a787-72fdd7fc91d7" (UID: "f79042af-3413-4614-a787-72fdd7fc91d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.843833 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f79042af-3413-4614-a787-72fdd7fc91d7-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.982804 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerStarted","Data":"43ca787f9992fd66e1df2dc99bef3a15717ced4d5aaca9226d27907e3f65a441"}
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.993979 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8564f79874-c88vw" event={"ID":"f79042af-3413-4614-a787-72fdd7fc91d7","Type":"ContainerDied","Data":"5c97cfbad8cd67e3b029562471229f3dbac1430883e1b3e233c33bc3ee6d9ab9"}
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.994037 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8564f79874-c88vw"
Nov 22 09:33:13 crc kubenswrapper[4846]: I1122 09:33:13.994078 4846 scope.go:117] "RemoveContainer" containerID="5a55e9de128b8e237f5cd8fabafd175065c25d630198e7d28d8c6d6779e35778"
Nov 22 09:33:14 crc kubenswrapper[4846]: I1122 09:33:14.064944 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="350dff5e-52cf-4530-9527-46f8c8dc3487" path="/var/lib/kubelet/pods/350dff5e-52cf-4530-9527-46f8c8dc3487/volumes"
Nov 22 09:33:14 crc kubenswrapper[4846]: I1122 09:33:14.066855 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8564f79874-c88vw"]
Nov 22 09:33:14 crc kubenswrapper[4846]: I1122 09:33:14.066913 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8564f79874-c88vw"]
Nov 22 09:33:14 crc kubenswrapper[4846]: I1122 09:33:14.201733 4846 scope.go:117] "RemoveContainer" containerID="27001cb788644216b3d4184a41d7a2c4b77ab48f03fe634319d55ff847ebacea"
Nov 22 09:33:14 crc kubenswrapper[4846]: I1122 09:33:14.417583 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:33:15 crc kubenswrapper[4846]: I1122 09:33:15.012448 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerStarted","Data":"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3"}
Nov 22 09:33:15 crc kubenswrapper[4846]: I1122 09:33:15.437035 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f98fdfc57-v8bnv"
Nov 22 09:33:15 crc kubenswrapper[4846]: I1122 09:33:15.519909 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57dccfc6dd-fnk6j"]
Nov 22 09:33:15 crc kubenswrapper[4846]: I1122 09:33:15.520704 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57dccfc6dd-fnk6j" podUID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerName="neutron-api" containerID="cri-o://306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586" gracePeriod=30
Nov 22 09:33:15 crc kubenswrapper[4846]: I1122 09:33:15.520790 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57dccfc6dd-fnk6j" podUID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerName="neutron-httpd" containerID="cri-o://dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce" gracePeriod=30
Nov 22 09:33:15 crc kubenswrapper[4846]: I1122 09:33:15.593308 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 22 09:33:15 crc kubenswrapper[4846]: I1122 09:33:15.593596 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerName="glance-log" containerID="cri-o://994e2e776f1e64c9a74b2724543c9d1fd2373e411e5de36b390558e153cd751d" gracePeriod=30
Nov 22 09:33:15 crc kubenswrapper[4846]: I1122 09:33:15.593782 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerName="glance-httpd" containerID="cri-o://24840462d593fd5e9db8ae83104c7d8c71a5b51e0c96fa0c54e9d5765805d224" gracePeriod=30
Nov 22 09:33:16 crc kubenswrapper[4846]: I1122 09:33:16.046588 4846 generic.go:334] "Generic (PLEG): container finished" podID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerID="dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce" exitCode=0
Nov 22 09:33:16 crc kubenswrapper[4846]: I1122 09:33:16.051296 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" path="/var/lib/kubelet/pods/f79042af-3413-4614-a787-72fdd7fc91d7/volumes"
Nov 22 09:33:16 crc kubenswrapper[4846]: I1122 09:33:16.051992 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57dccfc6dd-fnk6j" event={"ID":"4c879ac4-a859-4369-82eb-fc980f7a2881","Type":"ContainerDied","Data":"dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce"}
Nov 22 09:33:16 crc kubenswrapper[4846]: I1122 09:33:16.068393 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerStarted","Data":"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb"}
Nov 22 09:33:16 crc kubenswrapper[4846]: I1122 09:33:16.068482 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerStarted","Data":"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a"}
Nov 22 09:33:16 crc kubenswrapper[4846]: I1122 09:33:16.079421 4846 generic.go:334] "Generic (PLEG): container finished" podID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerID="994e2e776f1e64c9a74b2724543c9d1fd2373e411e5de36b390558e153cd751d" exitCode=143
Nov 22 09:33:16 crc kubenswrapper[4846]: I1122 09:33:16.079476 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faac3725-9476-4d5c-b3a2-f927e4fe7af1","Type":"ContainerDied","Data":"994e2e776f1e64c9a74b2724543c9d1fd2373e411e5de36b390558e153cd751d"}
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.104024 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerStarted","Data":"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf"}
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.106107 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.104307 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="sg-core" containerID="cri-o://e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb" gracePeriod=30
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.104216 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="ceilometer-central-agent" containerID="cri-o://a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3" gracePeriod=30
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.104387 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="ceilometer-notification-agent" containerID="cri-o://bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a" gracePeriod=30
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.104347 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="proxy-httpd" containerID="cri-o://03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf" gracePeriod=30
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.127500 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6390559209999997 podStartE2EDuration="6.127473449s" podCreationTimestamp="2025-11-22 09:33:12 +0000 UTC" firstStartedPulling="2025-11-22 09:33:13.716610817 +0000 UTC m=+1168.652300476" lastFinishedPulling="2025-11-22 09:33:17.205028355 +0000 UTC m=+1172.140718004" observedRunningTime="2025-11-22 09:33:18.124832612 +0000 UTC m=+1173.060522281" watchObservedRunningTime="2025-11-22 09:33:18.127473449 +0000 UTC m=+1173.063163098"
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.210387 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.210715 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerName="glance-log" containerID="cri-o://0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46" gracePeriod=30
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.210913 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerName="glance-httpd" containerID="cri-o://ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077" gracePeriod=30
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.892331 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57dccfc6dd-fnk6j"
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.977615 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-config\") pod \"4c879ac4-a859-4369-82eb-fc980f7a2881\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") "
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.977767 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-httpd-config\") pod \"4c879ac4-a859-4369-82eb-fc980f7a2881\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") "
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.977794 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-combined-ca-bundle\") pod \"4c879ac4-a859-4369-82eb-fc980f7a2881\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") "
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.977880 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpn8n\" (UniqueName: \"kubernetes.io/projected/4c879ac4-a859-4369-82eb-fc980f7a2881-kube-api-access-fpn8n\") pod \"4c879ac4-a859-4369-82eb-fc980f7a2881\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") "
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.977949 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-ovndb-tls-certs\") pod \"4c879ac4-a859-4369-82eb-fc980f7a2881\" (UID: \"4c879ac4-a859-4369-82eb-fc980f7a2881\") "
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.984361 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4c879ac4-a859-4369-82eb-fc980f7a2881" (UID: "4c879ac4-a859-4369-82eb-fc980f7a2881"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:18 crc kubenswrapper[4846]: I1122 09:33:18.984828 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c879ac4-a859-4369-82eb-fc980f7a2881-kube-api-access-fpn8n" (OuterVolumeSpecName: "kube-api-access-fpn8n") pod "4c879ac4-a859-4369-82eb-fc980f7a2881" (UID: "4c879ac4-a859-4369-82eb-fc980f7a2881"). InnerVolumeSpecName "kube-api-access-fpn8n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.026683 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.051269 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c879ac4-a859-4369-82eb-fc980f7a2881" (UID: "4c879ac4-a859-4369-82eb-fc980f7a2881"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.084656 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpn8n\" (UniqueName: \"kubernetes.io/projected/4c879ac4-a859-4369-82eb-fc980f7a2881-kube-api-access-fpn8n\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.085296 4846 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-httpd-config\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.085314 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.089401 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-config" (OuterVolumeSpecName: "config") pod "4c879ac4-a859-4369-82eb-fc980f7a2881" (UID: "4c879ac4-a859-4369-82eb-fc980f7a2881"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.096187 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4c879ac4-a859-4369-82eb-fc980f7a2881" (UID: "4c879ac4-a859-4369-82eb-fc980f7a2881"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.130807 4846 generic.go:334] "Generic (PLEG): container finished" podID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerID="03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf" exitCode=0
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.130845 4846 generic.go:334] "Generic (PLEG): container finished" podID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerID="e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb" exitCode=2
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.130855 4846 generic.go:334] "Generic (PLEG): container finished" podID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerID="bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a" exitCode=0
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.130866 4846 generic.go:334] "Generic (PLEG): container finished" podID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerID="a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3" exitCode=0
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.130923 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerDied","Data":"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf"}
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.130964 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerDied","Data":"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb"}
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.130977 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerDied","Data":"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a"}
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.130990 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerDied","Data":"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3"}
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.131005 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8c33e0-fb6f-480a-9916-48c51af17009","Type":"ContainerDied","Data":"43ca787f9992fd66e1df2dc99bef3a15717ced4d5aaca9226d27907e3f65a441"}
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.131026 4846 scope.go:117] "RemoveContainer" containerID="03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.131268 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.138266 4846 generic.go:334] "Generic (PLEG): container finished" podID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerID="24840462d593fd5e9db8ae83104c7d8c71a5b51e0c96fa0c54e9d5765805d224" exitCode=0
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.138349 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faac3725-9476-4d5c-b3a2-f927e4fe7af1","Type":"ContainerDied","Data":"24840462d593fd5e9db8ae83104c7d8c71a5b51e0c96fa0c54e9d5765805d224"}
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.142752 4846 generic.go:334] "Generic (PLEG): container finished" podID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerID="0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46" exitCode=143
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.142851 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0","Type":"ContainerDied","Data":"0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46"}
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.148791 4846 generic.go:334] "Generic (PLEG): container finished" podID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerID="306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586" exitCode=0
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.148852 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57dccfc6dd-fnk6j" event={"ID":"4c879ac4-a859-4369-82eb-fc980f7a2881","Type":"ContainerDied","Data":"306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586"}
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.148888 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57dccfc6dd-fnk6j" event={"ID":"4c879ac4-a859-4369-82eb-fc980f7a2881","Type":"ContainerDied","Data":"55bb3373036b55fd2070f540addc15bc7e58eddc07597d1447d91b376b4caac9"}
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.148974 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57dccfc6dd-fnk6j"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.187210 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-sg-core-conf-yaml\") pod \"dd8c33e0-fb6f-480a-9916-48c51af17009\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.187273 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnsmm\" (UniqueName: \"kubernetes.io/projected/dd8c33e0-fb6f-480a-9916-48c51af17009-kube-api-access-xnsmm\") pod \"dd8c33e0-fb6f-480a-9916-48c51af17009\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.187335 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-run-httpd\") pod \"dd8c33e0-fb6f-480a-9916-48c51af17009\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.187388 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-combined-ca-bundle\") pod \"dd8c33e0-fb6f-480a-9916-48c51af17009\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.187508 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-scripts\") pod \"dd8c33e0-fb6f-480a-9916-48c51af17009\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.187541 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-log-httpd\") pod \"dd8c33e0-fb6f-480a-9916-48c51af17009\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.187557 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-config-data\") pod \"dd8c33e0-fb6f-480a-9916-48c51af17009\" (UID: \"dd8c33e0-fb6f-480a-9916-48c51af17009\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.189773 4846 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.189795 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c879ac4-a859-4369-82eb-fc980f7a2881-config\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.193589 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd8c33e0-fb6f-480a-9916-48c51af17009" (UID: "dd8c33e0-fb6f-480a-9916-48c51af17009"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.194035 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd8c33e0-fb6f-480a-9916-48c51af17009" (UID: "dd8c33e0-fb6f-480a-9916-48c51af17009"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.202187 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8c33e0-fb6f-480a-9916-48c51af17009-kube-api-access-xnsmm" (OuterVolumeSpecName: "kube-api-access-xnsmm") pod "dd8c33e0-fb6f-480a-9916-48c51af17009" (UID: "dd8c33e0-fb6f-480a-9916-48c51af17009"). InnerVolumeSpecName "kube-api-access-xnsmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.206226 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-scripts" (OuterVolumeSpecName: "scripts") pod "dd8c33e0-fb6f-480a-9916-48c51af17009" (UID: "dd8c33e0-fb6f-480a-9916-48c51af17009"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.206374 4846 scope.go:117] "RemoveContainer" containerID="e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.219877 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57dccfc6dd-fnk6j"]
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.229552 4846 scope.go:117] "RemoveContainer" containerID="bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.229777 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.234797 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-57dccfc6dd-fnk6j"]
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.240661 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd8c33e0-fb6f-480a-9916-48c51af17009" (UID: "dd8c33e0-fb6f-480a-9916-48c51af17009"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.274241 4846 scope.go:117] "RemoveContainer" containerID="a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.292522 4846 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.292545 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnsmm\" (UniqueName: \"kubernetes.io/projected/dd8c33e0-fb6f-480a-9916-48c51af17009-kube-api-access-xnsmm\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.292555 4846 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.292564 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.292576 4846 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8c33e0-fb6f-480a-9916-48c51af17009-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.344874 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd8c33e0-fb6f-480a-9916-48c51af17009" (UID: "dd8c33e0-fb6f-480a-9916-48c51af17009"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.390337 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-config-data" (OuterVolumeSpecName: "config-data") pod "dd8c33e0-fb6f-480a-9916-48c51af17009" (UID: "dd8c33e0-fb6f-480a-9916-48c51af17009"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.396176 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-public-tls-certs\") pod \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.396328 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.396375 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-combined-ca-bundle\") pod \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.396520 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-config-data\") pod \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.396550 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-logs\") pod \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.396628 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-scripts\") pod \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.396650 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr5bd\" (UniqueName: \"kubernetes.io/projected/faac3725-9476-4d5c-b3a2-f927e4fe7af1-kube-api-access-gr5bd\") pod \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.396676 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-httpd-run\") pod \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\" (UID: \"faac3725-9476-4d5c-b3a2-f927e4fe7af1\") "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.397358 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.397371 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8c33e0-fb6f-480a-9916-48c51af17009-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.397983 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "faac3725-9476-4d5c-b3a2-f927e4fe7af1" (UID: "faac3725-9476-4d5c-b3a2-f927e4fe7af1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.398283 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-logs" (OuterVolumeSpecName: "logs") pod "faac3725-9476-4d5c-b3a2-f927e4fe7af1" (UID: "faac3725-9476-4d5c-b3a2-f927e4fe7af1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.404862 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faac3725-9476-4d5c-b3a2-f927e4fe7af1-kube-api-access-gr5bd" (OuterVolumeSpecName: "kube-api-access-gr5bd") pod "faac3725-9476-4d5c-b3a2-f927e4fe7af1" (UID: "faac3725-9476-4d5c-b3a2-f927e4fe7af1"). InnerVolumeSpecName "kube-api-access-gr5bd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.404868 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-scripts" (OuterVolumeSpecName: "scripts") pod "faac3725-9476-4d5c-b3a2-f927e4fe7af1" (UID: "faac3725-9476-4d5c-b3a2-f927e4fe7af1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.413238 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "faac3725-9476-4d5c-b3a2-f927e4fe7af1" (UID: "faac3725-9476-4d5c-b3a2-f927e4fe7af1"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.435455 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faac3725-9476-4d5c-b3a2-f927e4fe7af1" (UID: "faac3725-9476-4d5c-b3a2-f927e4fe7af1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.456173 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-86d575f679-k6l72"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.462614 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "faac3725-9476-4d5c-b3a2-f927e4fe7af1" (UID: "faac3725-9476-4d5c-b3a2-f927e4fe7af1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.470654 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-config-data" (OuterVolumeSpecName: "config-data") pod "faac3725-9476-4d5c-b3a2-f927e4fe7af1" (UID: "faac3725-9476-4d5c-b3a2-f927e4fe7af1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.522209 4846 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.522255 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.522269 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.522279 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-logs\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.522290 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.522302 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr5bd\" (UniqueName: \"kubernetes.io/projected/faac3725-9476-4d5c-b3a2-f927e4fe7af1-kube-api-access-gr5bd\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.522314 4846 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/faac3725-9476-4d5c-b3a2-f927e4fe7af1-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.522324 4846 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/faac3725-9476-4d5c-b3a2-f927e4fe7af1-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.553168 4846 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.575609 4846 scope.go:117] "RemoveContainer" containerID="03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf"
Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.576215 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf\": container with ID starting with 03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf not found: ID does not exist" containerID="03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.576635 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf"} err="failed to get container status \"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf\": rpc error: code = NotFound desc = could not find container \"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf\": container with ID starting with 03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf not found: ID does not exist"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.576712 4846 scope.go:117] "RemoveContainer" containerID="e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb"
Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.577120 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb\": container with ID starting with e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb not found: ID does not exist" containerID="e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.577175 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb"} err="failed to get container status \"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb\": rpc error: code = NotFound desc = could not find container \"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb\": container with ID starting with e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb not found: ID does not exist"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.577211 4846 scope.go:117] "RemoveContainer" containerID="bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a"
Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.577578 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a\": container with ID starting with bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a not found: ID does not exist" containerID="bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.577654 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a"} err="failed to get container status \"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a\": rpc error: code = NotFound desc = could not find container \"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a\": container with ID starting with bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a not found: ID does not exist"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.577698 4846 scope.go:117] "RemoveContainer" containerID="a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3"
Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.578191 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3\": container with ID starting with a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3 not found: ID does not exist" containerID="a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3"
Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.578217 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3"} err="failed to get container status \"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3\": rpc
error: code = NotFound desc = could not find container \"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3\": container with ID starting with a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3 not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.578233 4846 scope.go:117] "RemoveContainer" containerID="03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.579413 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf"} err="failed to get container status \"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf\": rpc error: code = NotFound desc = could not find container \"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf\": container with ID starting with 03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.579467 4846 scope.go:117] "RemoveContainer" containerID="e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.585883 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb"} err="failed to get container status \"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb\": rpc error: code = NotFound desc = could not find container \"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb\": container with ID starting with e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.585955 4846 scope.go:117] "RemoveContainer" containerID="bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.586659 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a"} err="failed to get container status \"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a\": rpc error: code = NotFound desc = could not find container \"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a\": container with ID starting with bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.586707 4846 scope.go:117] "RemoveContainer" containerID="a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.587267 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3"} err="failed to get container status \"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3\": rpc error: code = NotFound desc = could not find container \"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3\": container with ID starting with a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3 not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.587326 4846 scope.go:117] "RemoveContainer" containerID="03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf" Nov 22 09:33:19 crc 
kubenswrapper[4846]: I1122 09:33:19.587872 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf"} err="failed to get container status \"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf\": rpc error: code = NotFound desc = could not find container \"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf\": container with ID starting with 03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.587940 4846 scope.go:117] "RemoveContainer" containerID="e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.588627 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb"} err="failed to get container status \"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb\": rpc error: code = NotFound desc = could not find container \"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb\": container with ID starting with e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.588669 4846 scope.go:117] "RemoveContainer" containerID="bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.589574 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a"} err="failed to get container status \"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a\": rpc error: code = NotFound desc = could not find container \"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a\": container with ID starting with bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.589640 4846 scope.go:117] "RemoveContainer" containerID="a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.590284 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3"} err="failed to get container status \"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3\": rpc error: code = NotFound desc = could not find container \"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3\": container with ID starting with a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3 not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.590348 4846 scope.go:117] "RemoveContainer" containerID="03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.590729 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf"} err="failed to get container status \"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf\": rpc error: code = NotFound desc = could not find container \"03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf\": container with ID 
starting with 03383a84dd1ec9202a51dde3855b61130631bacf1e5a2f7b18589bd3532e67bf not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.590799 4846 scope.go:117] "RemoveContainer" containerID="e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.591154 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb"} err="failed to get container status \"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb\": rpc error: code = NotFound desc = could not find container \"e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb\": container with ID starting with e5a6908cf24336fe37eb9d633831e82e396a112d21903572e573e38128030afb not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.591187 4846 scope.go:117] "RemoveContainer" containerID="bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.591656 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a"} err="failed to get container status \"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a\": rpc error: code = NotFound desc = could not find container \"bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a\": container with ID starting with bc327549e9289a24c6d6a40849c0b817f03290702266ea5977ce846222dec53a not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.591687 4846 scope.go:117] "RemoveContainer" containerID="a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.592086 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3"} err="failed to get container status \"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3\": rpc error: code = NotFound desc = could not find container \"a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3\": container with ID starting with a31a20285e37b8e92c14f6c3b0691f2ea4bc652960773426cf116b0e0bcea2c3 not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.592121 4846 scope.go:117] "RemoveContainer" containerID="dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.596704 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.619782 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.626721 4846 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.627121 4846 scope.go:117] "RemoveContainer" containerID="306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.642443 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643035 4846 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="ceilometer-central-agent" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643077 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="ceilometer-central-agent" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643111 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="proxy-httpd" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643121 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="proxy-httpd" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643149 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643157 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643173 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerName="glance-log" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643180 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerName="glance-log" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643192 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerName="glance-httpd" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643200 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerName="glance-httpd" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643211 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerName="neutron-httpd" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643218 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerName="neutron-httpd" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643242 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="sg-core" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643249 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="sg-core" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643268 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerName="neutron-api" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643278 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerName="neutron-api" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643288 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="ceilometer-notification-agent" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643296 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="ceilometer-notification-agent" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.643308 4846 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon-log" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643314 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon-log" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643529 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="ceilometer-notification-agent" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643547 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="ceilometer-central-agent" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643556 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="proxy-httpd" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643569 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643585 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerName="neutron-httpd" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643597 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerName="glance-httpd" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643615 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c879ac4-a859-4369-82eb-fc980f7a2881" containerName="neutron-api" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643627 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" containerName="glance-log" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643644 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79042af-3413-4614-a787-72fdd7fc91d7" containerName="horizon-log" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.643652 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" containerName="sg-core" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.646978 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.650803 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.651876 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.653866 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.670016 4846 scope.go:117] "RemoveContainer" containerID="dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.674261 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce\": container with ID starting with dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce not found: ID does not exist" containerID="dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.674339 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce"} err="failed to get container status \"dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce\": rpc error: code = NotFound desc = could not find container \"dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce\": container with ID starting with dedc86c7a0cd3042f0cafcb40a843f1a01d104c7bec69527fb6874a116600cce not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.674381 4846 scope.go:117] "RemoveContainer" containerID="306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586" Nov 22 09:33:19 crc kubenswrapper[4846]: E1122 09:33:19.674946 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586\": container with ID starting with 306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586 not found: ID does not exist" containerID="306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.675006 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586"} err="failed to get container status \"306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586\": rpc error: code = NotFound desc = could not find container \"306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586\": container with ID starting with 306f270db3fc89eb417d390f84e6200a6555d148c79e89e5509c7f8409e75586 not found: ID does not exist" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.832503 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-log-httpd\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.832713 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f8fxk\" (UniqueName: \"kubernetes.io/projected/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-kube-api-access-f8fxk\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.832775 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.832845 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.832925 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-run-httpd\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.833097 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-config-data\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.833188 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-scripts\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.935021 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8fxk\" (UniqueName: \"kubernetes.io/projected/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-kube-api-access-f8fxk\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.935104 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.935160 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.935211 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-run-httpd\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc 
kubenswrapper[4846]: I1122 09:33:19.935236 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-config-data\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.935256 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-scripts\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.935280 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-log-httpd\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.936505 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-run-httpd\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.936644 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-log-httpd\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.939706 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.942739 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.942983 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-config-data\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.943559 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-scripts\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.952632 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8fxk\" (UniqueName: \"kubernetes.io/projected/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-kube-api-access-f8fxk\") pod \"ceilometer-0\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " pod="openstack/ceilometer-0" Nov 22 09:33:19 crc kubenswrapper[4846]: I1122 09:33:19.991268 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.054387 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c879ac4-a859-4369-82eb-fc980f7a2881" path="/var/lib/kubelet/pods/4c879ac4-a859-4369-82eb-fc980f7a2881/volumes" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.055648 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8c33e0-fb6f-480a-9916-48c51af17009" path="/var/lib/kubelet/pods/dd8c33e0-fb6f-480a-9916-48c51af17009/volumes" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.201218 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.201429 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"faac3725-9476-4d5c-b3a2-f927e4fe7af1","Type":"ContainerDied","Data":"731c36e606ec67f500855762b5a8744298f51eacec234e86e310165e3d73c31d"} Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.201628 4846 scope.go:117] "RemoveContainer" containerID="24840462d593fd5e9db8ae83104c7d8c71a5b51e0c96fa0c54e9d5765805d224" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.237838 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.251151 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.256383 4846 scope.go:117] "RemoveContainer" containerID="994e2e776f1e64c9a74b2724543c9d1fd2373e411e5de36b390558e153cd751d" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.263706 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.265580 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.268297 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.268634 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.277941 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.451993 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85455dd3-3442-40ad-bd48-80034e877a41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.452085 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-config-data\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.452168 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.452198 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85455dd3-3442-40ad-bd48-80034e877a41-logs\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.452240 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.452719 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-scripts\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.452788 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.452924 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-56vg4\" (UniqueName: \"kubernetes.io/projected/85455dd3-3442-40ad-bd48-80034e877a41-kube-api-access-56vg4\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.519122 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:20 crc kubenswrapper[4846]: W1122 09:33:20.525731 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ed1b3a1_34e5_4a77_961a_30c5ba68b180.slice/crio-4c7ba19d6704445a62b2f640fd330cf0d3ebb30d5fd8627cd698d73cb9b40c20 WatchSource:0}: Error finding container 4c7ba19d6704445a62b2f640fd330cf0d3ebb30d5fd8627cd698d73cb9b40c20: Status 404 returned error can't find the container with id 4c7ba19d6704445a62b2f640fd330cf0d3ebb30d5fd8627cd698d73cb9b40c20 Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.554458 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85455dd3-3442-40ad-bd48-80034e877a41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.554510 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-config-data\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.554572 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.554606 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85455dd3-3442-40ad-bd48-80034e877a41-logs\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.554660 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.554750 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-scripts\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.554767 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.554799 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56vg4\" (UniqueName: \"kubernetes.io/projected/85455dd3-3442-40ad-bd48-80034e877a41-kube-api-access-56vg4\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.555233 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85455dd3-3442-40ad-bd48-80034e877a41-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.555273 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85455dd3-3442-40ad-bd48-80034e877a41-logs\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.555532 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.562253 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.562311 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.563302 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-scripts\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.563397 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85455dd3-3442-40ad-bd48-80034e877a41-config-data\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.580905 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56vg4\" (UniqueName: \"kubernetes.io/projected/85455dd3-3442-40ad-bd48-80034e877a41-kube-api-access-56vg4\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 
09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.597161 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"85455dd3-3442-40ad-bd48-80034e877a41\") " pod="openstack/glance-default-external-api-0" Nov 22 09:33:20 crc kubenswrapper[4846]: I1122 09:33:20.632951 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:33:21 crc kubenswrapper[4846]: I1122 09:33:21.221548 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerStarted","Data":"4c7ba19d6704445a62b2f640fd330cf0d3ebb30d5fd8627cd698d73cb9b40c20"} Nov 22 09:33:21 crc kubenswrapper[4846]: I1122 09:33:21.271385 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:33:21 crc kubenswrapper[4846]: W1122 09:33:21.278540 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85455dd3_3442_40ad_bd48_80034e877a41.slice/crio-a56e25fecb7afdd89a9652eb5121bc6f1eca96b8de4c31f5bee59b30ee82b4ec WatchSource:0}: Error finding container a56e25fecb7afdd89a9652eb5121bc6f1eca96b8de4c31f5bee59b30ee82b4ec: Status 404 returned error can't find the container with id a56e25fecb7afdd89a9652eb5121bc6f1eca96b8de4c31f5bee59b30ee82b4ec Nov 22 09:33:21 crc kubenswrapper[4846]: I1122 09:33:21.947320 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:21 crc kubenswrapper[4846]: I1122 09:33:21.996668 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fw9w\" (UniqueName: \"kubernetes.io/projected/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-kube-api-access-2fw9w\") pod \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " Nov 22 09:33:21 crc kubenswrapper[4846]: I1122 09:33:21.996782 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-scripts\") pod \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " Nov 22 09:33:21 crc kubenswrapper[4846]: I1122 09:33:21.996842 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-combined-ca-bundle\") pod \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " Nov 22 09:33:21 crc kubenswrapper[4846]: I1122 09:33:21.996878 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-config-data\") pod \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " Nov 22 09:33:21 crc kubenswrapper[4846]: I1122 09:33:21.996927 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-logs\") pod \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " Nov 22 09:33:21 crc kubenswrapper[4846]: I1122 09:33:21.997826 4846 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-logs" (OuterVolumeSpecName: "logs") pod "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" (UID: "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.018863 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-kube-api-access-2fw9w" (OuterVolumeSpecName: "kube-api-access-2fw9w") pod "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" (UID: "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0"). InnerVolumeSpecName "kube-api-access-2fw9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.022514 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-scripts" (OuterVolumeSpecName: "scripts") pod "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" (UID: "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.052603 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faac3725-9476-4d5c-b3a2-f927e4fe7af1" path="/var/lib/kubelet/pods/faac3725-9476-4d5c-b3a2-f927e4fe7af1/volumes" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.098956 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-internal-tls-certs\") pod \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.099440 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.099525 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-httpd-run\") pod \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\" (UID: \"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0\") " Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.100192 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" (UID: "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.100607 4846 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.100631 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fw9w\" (UniqueName: \"kubernetes.io/projected/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-kube-api-access-2fw9w\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.100643 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.100653 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.105243 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" (UID: "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.121188 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" (UID: "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.143664 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-config-data" (OuterVolumeSpecName: "config-data") pod "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" (UID: "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.185374 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" (UID: "bbe5756d-eccc-4a1b-807b-7a5cd0962ea0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.202738 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.202779 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.202790 4846 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.202833 4846 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.228518 4846 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.244766 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerStarted","Data":"3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713"} Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.244826 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerStarted","Data":"55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c"} Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.256338 4846 generic.go:334] "Generic (PLEG): container finished" podID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerID="ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077" exitCode=0 Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.256528 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0","Type":"ContainerDied","Data":"ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077"} Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.256622 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bbe5756d-eccc-4a1b-807b-7a5cd0962ea0","Type":"ContainerDied","Data":"fa65ba9297be4f8c30299cfa79df1204052ace2022e619bdcbde6129a9372513"} Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.256679 4846 scope.go:117] "RemoveContainer" containerID="ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.257083 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.280984 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85455dd3-3442-40ad-bd48-80034e877a41","Type":"ContainerStarted","Data":"e0818a4937b38f20a4e5e4963cf5f170af61bf425ec7274cbc152b0320e9bbf6"} Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.281066 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85455dd3-3442-40ad-bd48-80034e877a41","Type":"ContainerStarted","Data":"a56e25fecb7afdd89a9652eb5121bc6f1eca96b8de4c31f5bee59b30ee82b4ec"} Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.303800 4846 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.373517 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.382838 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.392234 4846 scope.go:117] "RemoveContainer" containerID="0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.408302 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:33:22 crc kubenswrapper[4846]: E1122 09:33:22.409020 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerName="glance-httpd" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.409076 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerName="glance-httpd" Nov 22 09:33:22 crc kubenswrapper[4846]: E1122 09:33:22.409090 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerName="glance-log" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.409097 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerName="glance-log" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.409365 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerName="glance-log" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.409412 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" containerName="glance-httpd" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.411142 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.416165 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.416407 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.424878 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.470413 4846 scope.go:117] "RemoveContainer" containerID="ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077" Nov 22 09:33:22 crc kubenswrapper[4846]: E1122 09:33:22.470975 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077\": container with ID starting with ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077 not found: ID does not exist" containerID="ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.471061 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077"} err="failed to get container status \"ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077\": rpc error: code = NotFound desc = could not find container \"ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077\": container with ID starting with ac3d76164db931358163e0e76bd750cad2432ec5fde5e91271cf6e7cb89df077 not found: ID does not exist" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.471098 4846 scope.go:117] "RemoveContainer" containerID="0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46" Nov 22 09:33:22 crc kubenswrapper[4846]: E1122 09:33:22.472216 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46\": container with ID starting with 0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46 not found: ID does not exist" containerID="0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.472240 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46"} err="failed to get container status \"0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46\": rpc error: code = NotFound desc = could not find container \"0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46\": container with ID starting with 0178a22058a14ef4211dd7e1c10b947c3c3c71ebca8860def0d0a64b9b696a46 not found: ID does not exist" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.609614 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.609859 4846 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmxvc\" (UniqueName: \"kubernetes.io/projected/554a6b70-9c9c-4afd-9738-d207b3067a30-kube-api-access-lmxvc\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.610116 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/554a6b70-9c9c-4afd-9738-d207b3067a30-logs\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.610208 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.610522 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/554a6b70-9c9c-4afd-9738-d207b3067a30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.610573 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.610686 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.610769 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.713869 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/554a6b70-9c9c-4afd-9738-d207b3067a30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.713932 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 
09:33:22.714100 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.714167 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.714239 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.714383 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/554a6b70-9c9c-4afd-9738-d207b3067a30-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.714408 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmxvc\" (UniqueName: \"kubernetes.io/projected/554a6b70-9c9c-4afd-9738-d207b3067a30-kube-api-access-lmxvc\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.714558 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/554a6b70-9c9c-4afd-9738-d207b3067a30-logs\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.714633 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.714771 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.718869 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/554a6b70-9c9c-4afd-9738-d207b3067a30-logs\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.719521 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.720614 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-scripts\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.722976 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-config-data\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.725716 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/554a6b70-9c9c-4afd-9738-d207b3067a30-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.737894 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmxvc\" (UniqueName: \"kubernetes.io/projected/554a6b70-9c9c-4afd-9738-d207b3067a30-kube-api-access-lmxvc\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:22 crc kubenswrapper[4846]: I1122 09:33:22.755133 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"554a6b70-9c9c-4afd-9738-d207b3067a30\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:33:23 crc kubenswrapper[4846]: I1122 09:33:23.044568 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:23 crc kubenswrapper[4846]: I1122 09:33:23.297747 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85455dd3-3442-40ad-bd48-80034e877a41","Type":"ContainerStarted","Data":"43122817f12fb0567f0815369ccc4b4df28569d8e66e40eabce1a7775ff5efe8"} Nov 22 09:33:23 crc kubenswrapper[4846]: I1122 09:33:23.320912 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerStarted","Data":"8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41"} Nov 22 09:33:23 crc kubenswrapper[4846]: I1122 09:33:23.354403 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.3543659789999998 podStartE2EDuration="3.354365979s" podCreationTimestamp="2025-11-22 09:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:33:23.333638982 +0000 UTC m=+1178.269328631" watchObservedRunningTime="2025-11-22 09:33:23.354365979 +0000 UTC m=+1178.290055648" Nov 22 09:33:23 crc kubenswrapper[4846]: I1122 09:33:23.627777 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:33:23 crc kubenswrapper[4846]: W1122 09:33:23.648568 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod554a6b70_9c9c_4afd_9738_d207b3067a30.slice/crio-955f51b97bee86e9c035525145ca5c09ac298621e8a5f364c29daa4fb35ef52e WatchSource:0}: Error finding container 955f51b97bee86e9c035525145ca5c09ac298621e8a5f364c29daa4fb35ef52e: Status 404 returned error can't find the container with id 955f51b97bee86e9c035525145ca5c09ac298621e8a5f364c29daa4fb35ef52e Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.056014 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe5756d-eccc-4a1b-807b-7a5cd0962ea0" path="/var/lib/kubelet/pods/bbe5756d-eccc-4a1b-807b-7a5cd0962ea0/volumes" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.337556 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerStarted","Data":"1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258"} Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.338499 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.340539 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"554a6b70-9c9c-4afd-9738-d207b3067a30","Type":"ContainerStarted","Data":"1891b3b3772de517f2d43de7a894215d182ecb66936031369551f4ea2039ae5b"} Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.340584 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"554a6b70-9c9c-4afd-9738-d207b3067a30","Type":"ContainerStarted","Data":"955f51b97bee86e9c035525145ca5c09ac298621e8a5f364c29daa4fb35ef52e"} Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.389429 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.070313645 podStartE2EDuration="5.389401473s" 
podCreationTimestamp="2025-11-22 09:33:19 +0000 UTC" firstStartedPulling="2025-11-22 09:33:20.529186321 +0000 UTC m=+1175.464875970" lastFinishedPulling="2025-11-22 09:33:23.848274149 +0000 UTC m=+1178.783963798" observedRunningTime="2025-11-22 09:33:24.367141181 +0000 UTC m=+1179.302830830" watchObservedRunningTime="2025-11-22 09:33:24.389401473 +0000 UTC m=+1179.325091112" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.616367 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zkzjs"] Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.619413 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.640145 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zkzjs"] Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.668082 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-kube-api-access-vv5q5\") pod \"nova-api-db-create-zkzjs\" (UID: \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\") " pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.668360 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-operator-scripts\") pod \"nova-api-db-create-zkzjs\" (UID: \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\") " pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.733108 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tgvtl"] Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.734736 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.745061 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5847-account-create-qm7bq"] Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.746718 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.755103 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.757441 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tgvtl"] Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.771681 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-operator-scripts\") pod \"nova-api-db-create-zkzjs\" (UID: \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\") " pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.771809 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-kube-api-access-vv5q5\") pod \"nova-api-db-create-zkzjs\" (UID: \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\") " pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.773062 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-operator-scripts\") pod \"nova-api-db-create-zkzjs\" (UID: \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\") " pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.781103 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5847-account-create-qm7bq"] Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.807905 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-kube-api-access-vv5q5\") pod \"nova-api-db-create-zkzjs\" (UID: \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\") " pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.873408 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz8cq\" (UniqueName: \"kubernetes.io/projected/be978622-0c3f-41d3-b518-bbfcc1254b15-kube-api-access-dz8cq\") pod \"nova-api-5847-account-create-qm7bq\" (UID: \"be978622-0c3f-41d3-b518-bbfcc1254b15\") " pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.873757 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3ec915-c24b-4f67-8184-d21fd9a91e32-operator-scripts\") pod \"nova-cell0-db-create-tgvtl\" (UID: \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\") " pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.873842 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vtjg\" (UniqueName: \"kubernetes.io/projected/3e3ec915-c24b-4f67-8184-d21fd9a91e32-kube-api-access-4vtjg\") pod \"nova-cell0-db-create-tgvtl\" (UID: \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\") " pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.873866 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be978622-0c3f-41d3-b518-bbfcc1254b15-operator-scripts\") pod \"nova-api-5847-account-create-qm7bq\" (UID: \"be978622-0c3f-41d3-b518-bbfcc1254b15\") " pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.926109 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-t4dm2"] Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.927513 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.947599 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.958439 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a996-account-create-rxphj"] Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.962838 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.976364 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.978822 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8279q\" (UniqueName: \"kubernetes.io/projected/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-kube-api-access-8279q\") pod \"nova-cell1-db-create-t4dm2\" (UID: \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\") " pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.978944 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-operator-scripts\") pod \"nova-cell1-db-create-t4dm2\" (UID: \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\") " pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.978994 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3ec915-c24b-4f67-8184-d21fd9a91e32-operator-scripts\") pod \"nova-cell0-db-create-tgvtl\" (UID: \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\") " pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.979116 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vtjg\" (UniqueName: \"kubernetes.io/projected/3e3ec915-c24b-4f67-8184-d21fd9a91e32-kube-api-access-4vtjg\") pod \"nova-cell0-db-create-tgvtl\" (UID: \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\") " pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.979149 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be978622-0c3f-41d3-b518-bbfcc1254b15-operator-scripts\") pod \"nova-api-5847-account-create-qm7bq\" (UID: \"be978622-0c3f-41d3-b518-bbfcc1254b15\") " pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.979200 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz8cq\" (UniqueName: 
\"kubernetes.io/projected/be978622-0c3f-41d3-b518-bbfcc1254b15-kube-api-access-dz8cq\") pod \"nova-api-5847-account-create-qm7bq\" (UID: \"be978622-0c3f-41d3-b518-bbfcc1254b15\") " pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.980449 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3ec915-c24b-4f67-8184-d21fd9a91e32-operator-scripts\") pod \"nova-cell0-db-create-tgvtl\" (UID: \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\") " pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:24 crc kubenswrapper[4846]: I1122 09:33:24.982216 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be978622-0c3f-41d3-b518-bbfcc1254b15-operator-scripts\") pod \"nova-api-5847-account-create-qm7bq\" (UID: \"be978622-0c3f-41d3-b518-bbfcc1254b15\") " pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.008378 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t4dm2"] Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.032260 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz8cq\" (UniqueName: \"kubernetes.io/projected/be978622-0c3f-41d3-b518-bbfcc1254b15-kube-api-access-dz8cq\") pod \"nova-api-5847-account-create-qm7bq\" (UID: \"be978622-0c3f-41d3-b518-bbfcc1254b15\") " pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.032313 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vtjg\" (UniqueName: \"kubernetes.io/projected/3e3ec915-c24b-4f67-8184-d21fd9a91e32-kube-api-access-4vtjg\") pod \"nova-cell0-db-create-tgvtl\" (UID: \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\") " pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.056225 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.069603 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a996-account-create-rxphj"] Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.076209 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.081187 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwtv7\" (UniqueName: \"kubernetes.io/projected/150b9c82-e37f-41c2-a4ee-1578b73f9826-kube-api-access-fwtv7\") pod \"nova-cell0-a996-account-create-rxphj\" (UID: \"150b9c82-e37f-41c2-a4ee-1578b73f9826\") " pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.081234 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8279q\" (UniqueName: \"kubernetes.io/projected/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-kube-api-access-8279q\") pod \"nova-cell1-db-create-t4dm2\" (UID: \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\") " pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.081610 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/150b9c82-e37f-41c2-a4ee-1578b73f9826-operator-scripts\") pod \"nova-cell0-a996-account-create-rxphj\" (UID: \"150b9c82-e37f-41c2-a4ee-1578b73f9826\") " pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.081721 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-operator-scripts\") pod \"nova-cell1-db-create-t4dm2\" (UID: \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\") " pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.082579 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-operator-scripts\") pod \"nova-cell1-db-create-t4dm2\" (UID: \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\") " pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.135353 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8279q\" (UniqueName: \"kubernetes.io/projected/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-kube-api-access-8279q\") pod \"nova-cell1-db-create-t4dm2\" (UID: \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\") " pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.149634 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5dae-account-create-zzbjp"] Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.151730 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.156707 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.159985 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5dae-account-create-zzbjp"] Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.188330 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwtv7\" (UniqueName: \"kubernetes.io/projected/150b9c82-e37f-41c2-a4ee-1578b73f9826-kube-api-access-fwtv7\") pod \"nova-cell0-a996-account-create-rxphj\" (UID: \"150b9c82-e37f-41c2-a4ee-1578b73f9826\") " pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.188476 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/150b9c82-e37f-41c2-a4ee-1578b73f9826-operator-scripts\") pod \"nova-cell0-a996-account-create-rxphj\" (UID: \"150b9c82-e37f-41c2-a4ee-1578b73f9826\") " pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.191093 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/150b9c82-e37f-41c2-a4ee-1578b73f9826-operator-scripts\") pod \"nova-cell0-a996-account-create-rxphj\" (UID: \"150b9c82-e37f-41c2-a4ee-1578b73f9826\") " pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.223217 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwtv7\" (UniqueName: \"kubernetes.io/projected/150b9c82-e37f-41c2-a4ee-1578b73f9826-kube-api-access-fwtv7\") pod \"nova-cell0-a996-account-create-rxphj\" (UID: \"150b9c82-e37f-41c2-a4ee-1578b73f9826\") " pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.269508 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.301380 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j8fm\" (UniqueName: \"kubernetes.io/projected/c8a48516-03fd-4e58-9f00-588f82223270-kube-api-access-6j8fm\") pod \"nova-cell1-5dae-account-create-zzbjp\" (UID: \"c8a48516-03fd-4e58-9f00-588f82223270\") " pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.301570 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a48516-03fd-4e58-9f00-588f82223270-operator-scripts\") pod \"nova-cell1-5dae-account-create-zzbjp\" (UID: \"c8a48516-03fd-4e58-9f00-588f82223270\") " pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.324461 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.363790 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"554a6b70-9c9c-4afd-9738-d207b3067a30","Type":"ContainerStarted","Data":"ddcdf6ed62802c98e94f316bd8d012e6fa8a64c07dbe6b1616d83c6096b08788"} Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.397802 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.397777725 podStartE2EDuration="3.397777725s" podCreationTimestamp="2025-11-22 09:33:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:33:25.393210611 +0000 UTC m=+1180.328900260" watchObservedRunningTime="2025-11-22 09:33:25.397777725 +0000 UTC m=+1180.333467374" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.404462 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a48516-03fd-4e58-9f00-588f82223270-operator-scripts\") pod \"nova-cell1-5dae-account-create-zzbjp\" (UID: \"c8a48516-03fd-4e58-9f00-588f82223270\") " pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.404565 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8fm\" (UniqueName: \"kubernetes.io/projected/c8a48516-03fd-4e58-9f00-588f82223270-kube-api-access-6j8fm\") pod \"nova-cell1-5dae-account-create-zzbjp\" (UID: \"c8a48516-03fd-4e58-9f00-588f82223270\") " pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.406192 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a48516-03fd-4e58-9f00-588f82223270-operator-scripts\") pod \"nova-cell1-5dae-account-create-zzbjp\" (UID: \"c8a48516-03fd-4e58-9f00-588f82223270\") " pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.422327 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j8fm\" (UniqueName: \"kubernetes.io/projected/c8a48516-03fd-4e58-9f00-588f82223270-kube-api-access-6j8fm\") pod \"nova-cell1-5dae-account-create-zzbjp\" (UID: \"c8a48516-03fd-4e58-9f00-588f82223270\") " pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.503943 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.583064 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zkzjs"] Nov 22 09:33:25 crc kubenswrapper[4846]: W1122 09:33:25.683196 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod483e6e5b_838d_4fc3_ae1a_82ac6ba13439.slice/crio-b6b4bac49a00f94937b38b579b1a98bb1c23b8205d66fd55be2948e69764d91a WatchSource:0}: Error finding container b6b4bac49a00f94937b38b579b1a98bb1c23b8205d66fd55be2948e69764d91a: Status 404 returned error can't find the container with id b6b4bac49a00f94937b38b579b1a98bb1c23b8205d66fd55be2948e69764d91a Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.756479 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tgvtl"] Nov 22 09:33:25 crc kubenswrapper[4846]: I1122 09:33:25.873111 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5847-account-create-qm7bq"] Nov 22 09:33:25 crc kubenswrapper[4846]: W1122 09:33:25.917221 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe978622_0c3f_41d3_b518_bbfcc1254b15.slice/crio-31bbff9f5c99744a1719ce8262e28c2214536160f1392ae45a15e1d0f9ef4cec WatchSource:0}: Error finding container 31bbff9f5c99744a1719ce8262e28c2214536160f1392ae45a15e1d0f9ef4cec: Status 404 returned error can't find the container with id 31bbff9f5c99744a1719ce8262e28c2214536160f1392ae45a15e1d0f9ef4cec Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.115200 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a996-account-create-rxphj"] Nov 22 09:33:26 crc kubenswrapper[4846]: W1122 09:33:26.122873 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod150b9c82_e37f_41c2_a4ee_1578b73f9826.slice/crio-82bcf0bc7c8101112cb993c3fa5e83e4ad34ed32ae3f445541d19b5e5e2b99f7 WatchSource:0}: Error finding container 82bcf0bc7c8101112cb993c3fa5e83e4ad34ed32ae3f445541d19b5e5e2b99f7: Status 404 returned error can't find the container with id 82bcf0bc7c8101112cb993c3fa5e83e4ad34ed32ae3f445541d19b5e5e2b99f7 Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.170745 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t4dm2"] Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.348514 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5dae-account-create-zzbjp"] Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.384645 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a996-account-create-rxphj" event={"ID":"150b9c82-e37f-41c2-a4ee-1578b73f9826","Type":"ContainerStarted","Data":"b085c9e72a5a9368b7096ba50e7fb2e6bf539651f2c8555c62bed68842e0a655"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.384733 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a996-account-create-rxphj" event={"ID":"150b9c82-e37f-41c2-a4ee-1578b73f9826","Type":"ContainerStarted","Data":"82bcf0bc7c8101112cb993c3fa5e83e4ad34ed32ae3f445541d19b5e5e2b99f7"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.395825 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5dae-account-create-zzbjp" 
event={"ID":"c8a48516-03fd-4e58-9f00-588f82223270","Type":"ContainerStarted","Data":"d0b7f6f1159adc9b3fdf4ef37839e509fb52ffd1da1f6f4478963fb5ba73b234"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.404315 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgvtl" event={"ID":"3e3ec915-c24b-4f67-8184-d21fd9a91e32","Type":"ContainerStarted","Data":"d36e2b03a65a697f6a604e9a2e66a36e5e023a0a5fd2d3c62c6bd4a5f4ed1047"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.404369 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgvtl" event={"ID":"3e3ec915-c24b-4f67-8184-d21fd9a91e32","Type":"ContainerStarted","Data":"72540002dc258f9b258621c5f66276ea7925f89d2c5810bf1e3bd98c82ccb4c7"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.406821 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zkzjs" event={"ID":"483e6e5b-838d-4fc3-ae1a-82ac6ba13439","Type":"ContainerStarted","Data":"d1d8d349e5418a002194dd19912a535461117ebfae5f89e97531b0202ba7740b"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.406848 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zkzjs" event={"ID":"483e6e5b-838d-4fc3-ae1a-82ac6ba13439","Type":"ContainerStarted","Data":"b6b4bac49a00f94937b38b579b1a98bb1c23b8205d66fd55be2948e69764d91a"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.413529 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5847-account-create-qm7bq" event={"ID":"be978622-0c3f-41d3-b518-bbfcc1254b15","Type":"ContainerStarted","Data":"73eeb0ca2e6395869514fa1329b6a4d59c6cfe218f8cc81f9efaeeee170bbc6b"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.413568 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5847-account-create-qm7bq" event={"ID":"be978622-0c3f-41d3-b518-bbfcc1254b15","Type":"ContainerStarted","Data":"31bbff9f5c99744a1719ce8262e28c2214536160f1392ae45a15e1d0f9ef4cec"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.424314 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t4dm2" event={"ID":"a7b4db2e-84b4-474a-90d0-0b9ee78e122f","Type":"ContainerStarted","Data":"15b768f9e5b5689a99821d2627ea76f4efcb0c822d2151f661a0d131fe14cd7a"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.424422 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t4dm2" event={"ID":"a7b4db2e-84b4-474a-90d0-0b9ee78e122f","Type":"ContainerStarted","Data":"48adf5eb51112b2b876d9384c429cc779aa11e4c2129746ca85338458a6c1c7d"} Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.431720 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-a996-account-create-rxphj" podStartSLOduration=2.431694596 podStartE2EDuration="2.431694596s" podCreationTimestamp="2025-11-22 09:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:33:26.405598051 +0000 UTC m=+1181.341287700" watchObservedRunningTime="2025-11-22 09:33:26.431694596 +0000 UTC m=+1181.367384245" Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.432224 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-tgvtl" podStartSLOduration=2.432215171 podStartE2EDuration="2.432215171s" podCreationTimestamp="2025-11-22 09:33:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:33:26.422689592 +0000 UTC m=+1181.358379251" watchObservedRunningTime="2025-11-22 09:33:26.432215171 +0000 UTC m=+1181.367904820" Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.454308 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-5847-account-create-qm7bq" podStartSLOduration=2.454285938 podStartE2EDuration="2.454285938s" podCreationTimestamp="2025-11-22 09:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:33:26.443312786 +0000 UTC m=+1181.379002435" watchObservedRunningTime="2025-11-22 09:33:26.454285938 +0000 UTC m=+1181.389975587" Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.466931 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-zkzjs" podStartSLOduration=2.466914868 podStartE2EDuration="2.466914868s" podCreationTimestamp="2025-11-22 09:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:33:26.461218481 +0000 UTC m=+1181.396908120" watchObservedRunningTime="2025-11-22 09:33:26.466914868 +0000 UTC m=+1181.402604517" Nov 22 09:33:26 crc kubenswrapper[4846]: I1122 09:33:26.530531 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-t4dm2" podStartSLOduration=2.53049983 podStartE2EDuration="2.53049983s" podCreationTimestamp="2025-11-22 09:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:33:26.488420538 +0000 UTC m=+1181.424110187" watchObservedRunningTime="2025-11-22 09:33:26.53049983 +0000 UTC m=+1181.466189479" Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.441494 4846 generic.go:334] "Generic (PLEG): container finished" podID="c8a48516-03fd-4e58-9f00-588f82223270" containerID="3d25922e03648ba50de7206628b4a3773f88136364b8d7161edab37f45f8e5d9" exitCode=0 Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.441614 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5dae-account-create-zzbjp" event={"ID":"c8a48516-03fd-4e58-9f00-588f82223270","Type":"ContainerDied","Data":"3d25922e03648ba50de7206628b4a3773f88136364b8d7161edab37f45f8e5d9"} Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.445619 4846 generic.go:334] "Generic (PLEG): container finished" podID="3e3ec915-c24b-4f67-8184-d21fd9a91e32" containerID="d36e2b03a65a697f6a604e9a2e66a36e5e023a0a5fd2d3c62c6bd4a5f4ed1047" exitCode=0 Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.445690 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgvtl" event={"ID":"3e3ec915-c24b-4f67-8184-d21fd9a91e32","Type":"ContainerDied","Data":"d36e2b03a65a697f6a604e9a2e66a36e5e023a0a5fd2d3c62c6bd4a5f4ed1047"} Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.448625 4846 generic.go:334] "Generic (PLEG): container finished" podID="483e6e5b-838d-4fc3-ae1a-82ac6ba13439" containerID="d1d8d349e5418a002194dd19912a535461117ebfae5f89e97531b0202ba7740b" exitCode=0 Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.448665 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zkzjs" 
event={"ID":"483e6e5b-838d-4fc3-ae1a-82ac6ba13439","Type":"ContainerDied","Data":"d1d8d349e5418a002194dd19912a535461117ebfae5f89e97531b0202ba7740b"} Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.451804 4846 generic.go:334] "Generic (PLEG): container finished" podID="be978622-0c3f-41d3-b518-bbfcc1254b15" containerID="73eeb0ca2e6395869514fa1329b6a4d59c6cfe218f8cc81f9efaeeee170bbc6b" exitCode=0 Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.451841 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5847-account-create-qm7bq" event={"ID":"be978622-0c3f-41d3-b518-bbfcc1254b15","Type":"ContainerDied","Data":"73eeb0ca2e6395869514fa1329b6a4d59c6cfe218f8cc81f9efaeeee170bbc6b"} Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.455167 4846 generic.go:334] "Generic (PLEG): container finished" podID="a7b4db2e-84b4-474a-90d0-0b9ee78e122f" containerID="15b768f9e5b5689a99821d2627ea76f4efcb0c822d2151f661a0d131fe14cd7a" exitCode=0 Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.455240 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t4dm2" event={"ID":"a7b4db2e-84b4-474a-90d0-0b9ee78e122f","Type":"ContainerDied","Data":"15b768f9e5b5689a99821d2627ea76f4efcb0c822d2151f661a0d131fe14cd7a"} Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.458103 4846 generic.go:334] "Generic (PLEG): container finished" podID="150b9c82-e37f-41c2-a4ee-1578b73f9826" containerID="b085c9e72a5a9368b7096ba50e7fb2e6bf539651f2c8555c62bed68842e0a655" exitCode=0 Nov 22 09:33:27 crc kubenswrapper[4846]: I1122 09:33:27.458202 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a996-account-create-rxphj" event={"ID":"150b9c82-e37f-41c2-a4ee-1578b73f9826","Type":"ContainerDied","Data":"b085c9e72a5a9368b7096ba50e7fb2e6bf539651f2c8555c62bed68842e0a655"} Nov 22 09:33:28 crc kubenswrapper[4846]: I1122 09:33:28.625563 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:33:28 crc kubenswrapper[4846]: I1122 09:33:28.626194 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:33:28 crc kubenswrapper[4846]: I1122 09:33:28.626261 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:33:28 crc kubenswrapper[4846]: I1122 09:33:28.627528 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf9936e32ada96756d2d63284f53d35f1bafde25a492c2c86fd57715fcf497eb"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:33:28 crc kubenswrapper[4846]: I1122 09:33:28.627601 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" 
containerID="cri-o://cf9936e32ada96756d2d63284f53d35f1bafde25a492c2c86fd57715fcf497eb" gracePeriod=600 Nov 22 09:33:28 crc kubenswrapper[4846]: I1122 09:33:28.958265 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.018860 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vtjg\" (UniqueName: \"kubernetes.io/projected/3e3ec915-c24b-4f67-8184-d21fd9a91e32-kube-api-access-4vtjg\") pod \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\" (UID: \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.019173 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3ec915-c24b-4f67-8184-d21fd9a91e32-operator-scripts\") pod \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\" (UID: \"3e3ec915-c24b-4f67-8184-d21fd9a91e32\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.022968 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3ec915-c24b-4f67-8184-d21fd9a91e32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e3ec915-c24b-4f67-8184-d21fd9a91e32" (UID: "3e3ec915-c24b-4f67-8184-d21fd9a91e32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.034304 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3ec915-c24b-4f67-8184-d21fd9a91e32-kube-api-access-4vtjg" (OuterVolumeSpecName: "kube-api-access-4vtjg") pod "3e3ec915-c24b-4f67-8184-d21fd9a91e32" (UID: "3e3ec915-c24b-4f67-8184-d21fd9a91e32"). InnerVolumeSpecName "kube-api-access-4vtjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.124321 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e3ec915-c24b-4f67-8184-d21fd9a91e32-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.124350 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vtjg\" (UniqueName: \"kubernetes.io/projected/3e3ec915-c24b-4f67-8184-d21fd9a91e32-kube-api-access-4vtjg\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.226337 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.239914 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.255854 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.286189 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.296208 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328058 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j8fm\" (UniqueName: \"kubernetes.io/projected/c8a48516-03fd-4e58-9f00-588f82223270-kube-api-access-6j8fm\") pod \"c8a48516-03fd-4e58-9f00-588f82223270\" (UID: \"c8a48516-03fd-4e58-9f00-588f82223270\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328269 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz8cq\" (UniqueName: \"kubernetes.io/projected/be978622-0c3f-41d3-b518-bbfcc1254b15-kube-api-access-dz8cq\") pod \"be978622-0c3f-41d3-b518-bbfcc1254b15\" (UID: \"be978622-0c3f-41d3-b518-bbfcc1254b15\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328351 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be978622-0c3f-41d3-b518-bbfcc1254b15-operator-scripts\") pod \"be978622-0c3f-41d3-b518-bbfcc1254b15\" (UID: \"be978622-0c3f-41d3-b518-bbfcc1254b15\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328447 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-operator-scripts\") pod \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\" (UID: \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328492 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-operator-scripts\") pod \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\" (UID: \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328639 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a48516-03fd-4e58-9f00-588f82223270-operator-scripts\") pod \"c8a48516-03fd-4e58-9f00-588f82223270\" (UID: \"c8a48516-03fd-4e58-9f00-588f82223270\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328683 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-kube-api-access-vv5q5\") pod \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\" (UID: \"483e6e5b-838d-4fc3-ae1a-82ac6ba13439\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328714 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/150b9c82-e37f-41c2-a4ee-1578b73f9826-operator-scripts\") pod \"150b9c82-e37f-41c2-a4ee-1578b73f9826\" (UID: \"150b9c82-e37f-41c2-a4ee-1578b73f9826\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328746 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwtv7\" (UniqueName: \"kubernetes.io/projected/150b9c82-e37f-41c2-a4ee-1578b73f9826-kube-api-access-fwtv7\") pod \"150b9c82-e37f-41c2-a4ee-1578b73f9826\" (UID: \"150b9c82-e37f-41c2-a4ee-1578b73f9826\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.328784 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8279q\" (UniqueName: 
\"kubernetes.io/projected/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-kube-api-access-8279q\") pod \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\" (UID: \"a7b4db2e-84b4-474a-90d0-0b9ee78e122f\") " Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.330895 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7b4db2e-84b4-474a-90d0-0b9ee78e122f" (UID: "a7b4db2e-84b4-474a-90d0-0b9ee78e122f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.333449 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a48516-03fd-4e58-9f00-588f82223270-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8a48516-03fd-4e58-9f00-588f82223270" (UID: "c8a48516-03fd-4e58-9f00-588f82223270"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.336107 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a48516-03fd-4e58-9f00-588f82223270-kube-api-access-6j8fm" (OuterVolumeSpecName: "kube-api-access-6j8fm") pod "c8a48516-03fd-4e58-9f00-588f82223270" (UID: "c8a48516-03fd-4e58-9f00-588f82223270"). InnerVolumeSpecName "kube-api-access-6j8fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.336638 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/150b9c82-e37f-41c2-a4ee-1578b73f9826-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "150b9c82-e37f-41c2-a4ee-1578b73f9826" (UID: "150b9c82-e37f-41c2-a4ee-1578b73f9826"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.337058 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be978622-0c3f-41d3-b518-bbfcc1254b15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be978622-0c3f-41d3-b518-bbfcc1254b15" (UID: "be978622-0c3f-41d3-b518-bbfcc1254b15"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.337732 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be978622-0c3f-41d3-b518-bbfcc1254b15-kube-api-access-dz8cq" (OuterVolumeSpecName: "kube-api-access-dz8cq") pod "be978622-0c3f-41d3-b518-bbfcc1254b15" (UID: "be978622-0c3f-41d3-b518-bbfcc1254b15"). InnerVolumeSpecName "kube-api-access-dz8cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.338015 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "483e6e5b-838d-4fc3-ae1a-82ac6ba13439" (UID: "483e6e5b-838d-4fc3-ae1a-82ac6ba13439"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.346481 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150b9c82-e37f-41c2-a4ee-1578b73f9826-kube-api-access-fwtv7" (OuterVolumeSpecName: "kube-api-access-fwtv7") pod "150b9c82-e37f-41c2-a4ee-1578b73f9826" (UID: "150b9c82-e37f-41c2-a4ee-1578b73f9826"). InnerVolumeSpecName "kube-api-access-fwtv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.348409 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-kube-api-access-8279q" (OuterVolumeSpecName: "kube-api-access-8279q") pod "a7b4db2e-84b4-474a-90d0-0b9ee78e122f" (UID: "a7b4db2e-84b4-474a-90d0-0b9ee78e122f"). InnerVolumeSpecName "kube-api-access-8279q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.354407 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-kube-api-access-vv5q5" (OuterVolumeSpecName: "kube-api-access-vv5q5") pod "483e6e5b-838d-4fc3-ae1a-82ac6ba13439" (UID: "483e6e5b-838d-4fc3-ae1a-82ac6ba13439"). InnerVolumeSpecName "kube-api-access-vv5q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432182 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz8cq\" (UniqueName: \"kubernetes.io/projected/be978622-0c3f-41d3-b518-bbfcc1254b15-kube-api-access-dz8cq\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432255 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be978622-0c3f-41d3-b518-bbfcc1254b15-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432271 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432283 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432296 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a48516-03fd-4e58-9f00-588f82223270-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432310 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv5q5\" (UniqueName: \"kubernetes.io/projected/483e6e5b-838d-4fc3-ae1a-82ac6ba13439-kube-api-access-vv5q5\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432322 4846 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/150b9c82-e37f-41c2-a4ee-1578b73f9826-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432335 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwtv7\" (UniqueName: 
\"kubernetes.io/projected/150b9c82-e37f-41c2-a4ee-1578b73f9826-kube-api-access-fwtv7\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432347 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8279q\" (UniqueName: \"kubernetes.io/projected/a7b4db2e-84b4-474a-90d0-0b9ee78e122f-kube-api-access-8279q\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.432359 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j8fm\" (UniqueName: \"kubernetes.io/projected/c8a48516-03fd-4e58-9f00-588f82223270-kube-api-access-6j8fm\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.480261 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a996-account-create-rxphj" event={"ID":"150b9c82-e37f-41c2-a4ee-1578b73f9826","Type":"ContainerDied","Data":"82bcf0bc7c8101112cb993c3fa5e83e4ad34ed32ae3f445541d19b5e5e2b99f7"} Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.480345 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82bcf0bc7c8101112cb993c3fa5e83e4ad34ed32ae3f445541d19b5e5e2b99f7" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.480352 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a996-account-create-rxphj" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.485578 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5dae-account-create-zzbjp" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.485580 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5dae-account-create-zzbjp" event={"ID":"c8a48516-03fd-4e58-9f00-588f82223270","Type":"ContainerDied","Data":"d0b7f6f1159adc9b3fdf4ef37839e509fb52ffd1da1f6f4478963fb5ba73b234"} Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.485839 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0b7f6f1159adc9b3fdf4ef37839e509fb52ffd1da1f6f4478963fb5ba73b234" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.490585 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="cf9936e32ada96756d2d63284f53d35f1bafde25a492c2c86fd57715fcf497eb" exitCode=0 Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.490649 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"cf9936e32ada96756d2d63284f53d35f1bafde25a492c2c86fd57715fcf497eb"} Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.490685 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"a14897a35386470f39071e84723014ffd191c85c1c0f4368970f8ed940d4ab69"} Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.490709 4846 scope.go:117] "RemoveContainer" containerID="bac90441ca960230e02742d36a5b95524d70b371a6ee7e32b617df01413fca78" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.497127 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tgvtl" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.497143 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgvtl" event={"ID":"3e3ec915-c24b-4f67-8184-d21fd9a91e32","Type":"ContainerDied","Data":"72540002dc258f9b258621c5f66276ea7925f89d2c5810bf1e3bd98c82ccb4c7"} Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.497298 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72540002dc258f9b258621c5f66276ea7925f89d2c5810bf1e3bd98c82ccb4c7" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.499356 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zkzjs" event={"ID":"483e6e5b-838d-4fc3-ae1a-82ac6ba13439","Type":"ContainerDied","Data":"b6b4bac49a00f94937b38b579b1a98bb1c23b8205d66fd55be2948e69764d91a"} Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.499420 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b4bac49a00f94937b38b579b1a98bb1c23b8205d66fd55be2948e69764d91a" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.499394 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zkzjs" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.501973 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5847-account-create-qm7bq" event={"ID":"be978622-0c3f-41d3-b518-bbfcc1254b15","Type":"ContainerDied","Data":"31bbff9f5c99744a1719ce8262e28c2214536160f1392ae45a15e1d0f9ef4cec"} Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.502005 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31bbff9f5c99744a1719ce8262e28c2214536160f1392ae45a15e1d0f9ef4cec" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.502135 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5847-account-create-qm7bq" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.507521 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t4dm2" event={"ID":"a7b4db2e-84b4-474a-90d0-0b9ee78e122f","Type":"ContainerDied","Data":"48adf5eb51112b2b876d9384c429cc779aa11e4c2129746ca85338458a6c1c7d"} Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.507583 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48adf5eb51112b2b876d9384c429cc779aa11e4c2129746ca85338458a6c1c7d" Nov 22 09:33:29 crc kubenswrapper[4846]: I1122 09:33:29.507661 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t4dm2" Nov 22 09:33:30 crc kubenswrapper[4846]: I1122 09:33:30.633466 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 09:33:30 crc kubenswrapper[4846]: I1122 09:33:30.635786 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 09:33:30 crc kubenswrapper[4846]: I1122 09:33:30.679016 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 09:33:30 crc kubenswrapper[4846]: I1122 09:33:30.692606 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 09:33:31 crc kubenswrapper[4846]: I1122 09:33:31.531749 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 09:33:31 crc kubenswrapper[4846]: I1122 09:33:31.532174 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 09:33:33 crc kubenswrapper[4846]: I1122 09:33:33.046099 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:33 crc kubenswrapper[4846]: I1122 09:33:33.046668 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:33 crc kubenswrapper[4846]: I1122 09:33:33.101248 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:33 crc kubenswrapper[4846]: I1122 09:33:33.117934 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:33 crc kubenswrapper[4846]: I1122 09:33:33.550163 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:33 crc kubenswrapper[4846]: I1122 09:33:33.550200 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:33 crc kubenswrapper[4846]: I1122 09:33:33.619595 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 09:33:33 crc kubenswrapper[4846]: I1122 09:33:33.620118 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 09:33:33 crc kubenswrapper[4846]: I1122 09:33:33.622681 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.566469 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.567302 4846 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.665116 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944222 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fzdp"] Nov 22 09:33:35 crc kubenswrapper[4846]: E1122 09:33:35.944622 4846 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="be978622-0c3f-41d3-b518-bbfcc1254b15" containerName="mariadb-account-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944645 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="be978622-0c3f-41d3-b518-bbfcc1254b15" containerName="mariadb-account-create" Nov 22 09:33:35 crc kubenswrapper[4846]: E1122 09:33:35.944667 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3ec915-c24b-4f67-8184-d21fd9a91e32" containerName="mariadb-database-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944673 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3ec915-c24b-4f67-8184-d21fd9a91e32" containerName="mariadb-database-create" Nov 22 09:33:35 crc kubenswrapper[4846]: E1122 09:33:35.944684 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a48516-03fd-4e58-9f00-588f82223270" containerName="mariadb-account-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944691 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a48516-03fd-4e58-9f00-588f82223270" containerName="mariadb-account-create" Nov 22 09:33:35 crc kubenswrapper[4846]: E1122 09:33:35.944718 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150b9c82-e37f-41c2-a4ee-1578b73f9826" containerName="mariadb-account-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944724 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="150b9c82-e37f-41c2-a4ee-1578b73f9826" containerName="mariadb-account-create" Nov 22 09:33:35 crc kubenswrapper[4846]: E1122 09:33:35.944733 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483e6e5b-838d-4fc3-ae1a-82ac6ba13439" containerName="mariadb-database-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944739 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="483e6e5b-838d-4fc3-ae1a-82ac6ba13439" containerName="mariadb-database-create" Nov 22 09:33:35 crc kubenswrapper[4846]: E1122 09:33:35.944765 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b4db2e-84b4-474a-90d0-0b9ee78e122f" containerName="mariadb-database-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944772 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b4db2e-84b4-474a-90d0-0b9ee78e122f" containerName="mariadb-database-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944954 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="150b9c82-e37f-41c2-a4ee-1578b73f9826" containerName="mariadb-account-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944968 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a48516-03fd-4e58-9f00-588f82223270" containerName="mariadb-account-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944979 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="be978622-0c3f-41d3-b518-bbfcc1254b15" containerName="mariadb-account-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944990 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3ec915-c24b-4f67-8184-d21fd9a91e32" containerName="mariadb-database-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.944999 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="483e6e5b-838d-4fc3-ae1a-82ac6ba13439" containerName="mariadb-database-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.945009 4846 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a7b4db2e-84b4-474a-90d0-0b9ee78e122f" containerName="mariadb-database-create" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.945680 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.950995 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.951621 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pt9t4" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.951762 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 09:33:35 crc kubenswrapper[4846]: I1122 09:33:35.969043 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fzdp"] Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.094691 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-scripts\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.094781 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-config-data\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.094808 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xgt\" (UniqueName: \"kubernetes.io/projected/e9154a2c-895f-4ef4-921a-08305d1f8c4f-kube-api-access-p6xgt\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.094935 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.197589 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.197759 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-scripts\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.197790 4846 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-config-data\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.197810 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xgt\" (UniqueName: \"kubernetes.io/projected/e9154a2c-895f-4ef4-921a-08305d1f8c4f-kube-api-access-p6xgt\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.207239 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.208091 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.209004 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-config-data\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.216922 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-scripts\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.221669 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xgt\" (UniqueName: \"kubernetes.io/projected/e9154a2c-895f-4ef4-921a-08305d1f8c4f-kube-api-access-p6xgt\") pod \"nova-cell0-conductor-db-sync-8fzdp\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.301590 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:36 crc kubenswrapper[4846]: I1122 09:33:36.858125 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fzdp"] Nov 22 09:33:37 crc kubenswrapper[4846]: I1122 09:33:37.597892 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8fzdp" event={"ID":"e9154a2c-895f-4ef4-921a-08305d1f8c4f","Type":"ContainerStarted","Data":"b2f66e3d8b29f0c2648593b3ae6aaeed881a0dba08e137dc392f57dfd1678c8d"} Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.275847 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.277312 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="ceilometer-central-agent" containerID="cri-o://55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c" gracePeriod=30 Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.278779 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="proxy-httpd" containerID="cri-o://1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258" gracePeriod=30 Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.278893 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="ceilometer-notification-agent" containerID="cri-o://3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713" gracePeriod=30 Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.278990 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="sg-core" containerID="cri-o://8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41" gracePeriod=30 Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.286778 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.696463 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8fzdp" event={"ID":"e9154a2c-895f-4ef4-921a-08305d1f8c4f","Type":"ContainerStarted","Data":"e4eff862f8b641426e53f1112558ad1879bdfa977c7fed124e3a5fdf0376720b"} Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.704689 4846 generic.go:334] "Generic (PLEG): container finished" podID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerID="1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258" exitCode=0 Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.704721 4846 generic.go:334] "Generic (PLEG): container finished" podID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerID="8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41" exitCode=2 Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.704744 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerDied","Data":"1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258"} Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.704768 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerDied","Data":"8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41"} Nov 22 09:33:45 crc kubenswrapper[4846]: I1122 09:33:45.721354 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-8fzdp" podStartSLOduration=2.778448088 podStartE2EDuration="10.721332748s" podCreationTimestamp="2025-11-22 09:33:35 +0000 UTC" firstStartedPulling="2025-11-22 09:33:36.866334986 +0000 UTC m=+1191.802024635" lastFinishedPulling="2025-11-22 09:33:44.809219636 +0000 UTC m=+1199.744909295" observedRunningTime="2025-11-22 09:33:45.715022613 +0000 UTC m=+1200.650712302" watchObservedRunningTime="2025-11-22 09:33:45.721332748 +0000 UTC m=+1200.657022397" Nov 22 09:33:46 crc kubenswrapper[4846]: I1122 09:33:46.720786 4846 generic.go:334] "Generic (PLEG): container finished" podID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerID="55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c" exitCode=0 Nov 22 09:33:46 crc kubenswrapper[4846]: I1122 09:33:46.720859 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerDied","Data":"55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c"} Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.084201 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.085004 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="340633f3-603b-416e-924a-2938adbde84f" containerName="kube-state-metrics" containerID="cri-o://2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e" gracePeriod=30 Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.461772 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.550290 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.633948 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-sg-core-conf-yaml\") pod \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.634219 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-run-httpd\") pod \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.634245 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-config-data\") pod \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.634408 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-scripts\") pod \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.634437 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-combined-ca-bundle\") pod \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.634465 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-log-httpd\") pod \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.634494 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8fxk\" (UniqueName: \"kubernetes.io/projected/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-kube-api-access-f8fxk\") pod \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\" (UID: \"1ed1b3a1-34e5-4a77-961a-30c5ba68b180\") " Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.634799 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ed1b3a1-34e5-4a77-961a-30c5ba68b180" (UID: "1ed1b3a1-34e5-4a77-961a-30c5ba68b180"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.635471 4846 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.635632 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ed1b3a1-34e5-4a77-961a-30c5ba68b180" (UID: "1ed1b3a1-34e5-4a77-961a-30c5ba68b180"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.641956 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-scripts" (OuterVolumeSpecName: "scripts") pod "1ed1b3a1-34e5-4a77-961a-30c5ba68b180" (UID: "1ed1b3a1-34e5-4a77-961a-30c5ba68b180"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.644776 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-kube-api-access-f8fxk" (OuterVolumeSpecName: "kube-api-access-f8fxk") pod "1ed1b3a1-34e5-4a77-961a-30c5ba68b180" (UID: "1ed1b3a1-34e5-4a77-961a-30c5ba68b180"). InnerVolumeSpecName "kube-api-access-f8fxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.667370 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ed1b3a1-34e5-4a77-961a-30c5ba68b180" (UID: "1ed1b3a1-34e5-4a77-961a-30c5ba68b180"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.724077 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ed1b3a1-34e5-4a77-961a-30c5ba68b180" (UID: "1ed1b3a1-34e5-4a77-961a-30c5ba68b180"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.737259 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn2hk\" (UniqueName: \"kubernetes.io/projected/340633f3-603b-416e-924a-2938adbde84f-kube-api-access-sn2hk\") pod \"340633f3-603b-416e-924a-2938adbde84f\" (UID: \"340633f3-603b-416e-924a-2938adbde84f\") " Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.737818 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.737842 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.737857 4846 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.737868 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8fxk\" (UniqueName: \"kubernetes.io/projected/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-kube-api-access-f8fxk\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.737877 4846 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.740937 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/340633f3-603b-416e-924a-2938adbde84f-kube-api-access-sn2hk" (OuterVolumeSpecName: "kube-api-access-sn2hk") pod "340633f3-603b-416e-924a-2938adbde84f" (UID: "340633f3-603b-416e-924a-2938adbde84f"). InnerVolumeSpecName "kube-api-access-sn2hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.763496 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-config-data" (OuterVolumeSpecName: "config-data") pod "1ed1b3a1-34e5-4a77-961a-30c5ba68b180" (UID: "1ed1b3a1-34e5-4a77-961a-30c5ba68b180"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.766921 4846 generic.go:334] "Generic (PLEG): container finished" podID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerID="3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713" exitCode=0 Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.767004 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerDied","Data":"3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713"} Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.767034 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed1b3a1-34e5-4a77-961a-30c5ba68b180","Type":"ContainerDied","Data":"4c7ba19d6704445a62b2f640fd330cf0d3ebb30d5fd8627cd698d73cb9b40c20"} Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.767070 4846 scope.go:117] "RemoveContainer" containerID="1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.767213 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.775645 4846 generic.go:334] "Generic (PLEG): container finished" podID="340633f3-603b-416e-924a-2938adbde84f" containerID="2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e" exitCode=2 Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.775715 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"340633f3-603b-416e-924a-2938adbde84f","Type":"ContainerDied","Data":"2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e"} Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.775761 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"340633f3-603b-416e-924a-2938adbde84f","Type":"ContainerDied","Data":"ea76cfa65bd78a07f8f76760e067eb45801acc4be042c74a50f835f838c4b8a5"} Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.775847 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.805419 4846 scope.go:117] "RemoveContainer" containerID="8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.817527 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.829401 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.837948 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.839655 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn2hk\" (UniqueName: \"kubernetes.io/projected/340633f3-603b-416e-924a-2938adbde84f-kube-api-access-sn2hk\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.839683 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed1b3a1-34e5-4a77-961a-30c5ba68b180-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.847168 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.847495 4846 scope.go:117] "RemoveContainer" containerID="3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.864939 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.865646 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340633f3-603b-416e-924a-2938adbde84f" containerName="kube-state-metrics" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.865696 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="340633f3-603b-416e-924a-2938adbde84f" containerName="kube-state-metrics" Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.865713 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="sg-core" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.865721 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="sg-core" Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.865745 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="ceilometer-central-agent" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.865778 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="ceilometer-central-agent" Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.865798 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="proxy-httpd" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.865810 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="proxy-httpd" Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.865867 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="ceilometer-notification-agent" Nov 22 09:33:49 crc 
kubenswrapper[4846]: I1122 09:33:49.865879 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="ceilometer-notification-agent" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.866241 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="ceilometer-central-agent" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.866268 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="340633f3-603b-416e-924a-2938adbde84f" containerName="kube-state-metrics" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.866280 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="sg-core" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.866310 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="ceilometer-notification-agent" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.866326 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" containerName="proxy-httpd" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.868995 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.873664 4846 scope.go:117] "RemoveContainer" containerID="55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.873767 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.873966 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.874240 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gcfwz" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.879084 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.880638 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.883820 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.884033 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.903073 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.909878 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.924979 4846 scope.go:117] "RemoveContainer" containerID="1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258" Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.926675 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258\": container with ID starting with 1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258 not found: ID does not exist" containerID="1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.926725 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258"} err="failed to get container status \"1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258\": rpc error: code = NotFound desc = could not find container \"1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258\": container with ID starting with 1ed970b68f7b8e65d8a81215b3c4c6d6402ae21cc1ba4fc5f82e38ab91c0a258 not found: ID does not exist" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.926750 4846 scope.go:117] "RemoveContainer" containerID="8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41" Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.927459 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41\": container with ID starting with 8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41 not found: ID does not exist" containerID="8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.927507 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41"} err="failed to get container status \"8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41\": rpc error: code = NotFound desc = could not find container \"8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41\": container with ID starting with 8c9eb766e1d948d759cec3cc28b529ee49ccba64b479c7b8c4c3d1922cba3e41 not found: ID does not exist" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.927524 4846 scope.go:117] "RemoveContainer" containerID="3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713" Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.929817 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713\": container with ID starting with 3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713 not found: ID does not exist" containerID="3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.929875 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713"} err="failed to get container status \"3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713\": rpc error: code = NotFound desc = could not find container \"3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713\": container with ID starting with 3f4d6ad67c6ebd3a4acf139a1bd7263820b692324f2d91f2fa875f11d5cd6713 not found: ID does not exist" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.929912 4846 scope.go:117] "RemoveContainer" containerID="55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c" Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.934032 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c\": container with ID starting with 55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c not found: ID does not exist" containerID="55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.934094 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c"} err="failed to get container status \"55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c\": rpc error: code = NotFound desc = could not find container \"55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c\": container with ID starting with 55faf514d618c2dd0bd2d6307f4bcc78c1cececc655f52f0a08277810d54b31c not found: ID does not exist" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.934127 4846 scope.go:117] "RemoveContainer" containerID="2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.961419 4846 scope.go:117] "RemoveContainer" containerID="2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e" Nov 22 09:33:49 crc kubenswrapper[4846]: E1122 09:33:49.962171 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e\": container with ID starting with 2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e not found: ID does not exist" containerID="2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e" Nov 22 09:33:49 crc kubenswrapper[4846]: I1122 09:33:49.962217 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e"} err="failed to get container status \"2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e\": rpc error: code = NotFound desc = could not find container \"2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e\": container with ID starting with 2568f957bcb776b3ecd18f3408669d1552fe21ee38d1bda6352d168bcd9ecb6e not found: ID does not exist" Nov 22 09:33:50 crc 
kubenswrapper[4846]: I1122 09:33:50.045422 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.045470 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.045491 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.045531 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-config-data\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.045556 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-log-httpd\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.045572 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4vtf\" (UniqueName: \"kubernetes.io/projected/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-kube-api-access-d4vtf\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.045602 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqnpc\" (UniqueName: \"kubernetes.io/projected/0fd320e6-1b20-4a5d-890c-91e19f17f29c-kube-api-access-zqnpc\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.045690 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.045732 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 
09:33:50.045782 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-scripts\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.045809 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-run-httpd\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.053303 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed1b3a1-34e5-4a77-961a-30c5ba68b180" path="/var/lib/kubelet/pods/1ed1b3a1-34e5-4a77-961a-30c5ba68b180/volumes" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.054128 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="340633f3-603b-416e-924a-2938adbde84f" path="/var/lib/kubelet/pods/340633f3-603b-416e-924a-2938adbde84f/volumes" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.147989 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148091 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-scripts\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148121 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-run-httpd\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148214 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148241 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148267 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148300 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-config-data\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148321 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-log-httpd\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148340 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4vtf\" (UniqueName: \"kubernetes.io/projected/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-kube-api-access-d4vtf\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148367 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqnpc\" (UniqueName: \"kubernetes.io/projected/0fd320e6-1b20-4a5d-890c-91e19f17f29c-kube-api-access-zqnpc\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.148384 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.150288 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-run-httpd\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.151490 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-log-httpd\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.156474 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-config-data\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.156728 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.157240 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.157470 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.158838 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-scripts\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.165353 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.168593 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.170008 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4vtf\" (UniqueName: \"kubernetes.io/projected/4377a3fa-e17a-42e4-ab0b-37f76e90dbf9-kube-api-access-d4vtf\") pod \"kube-state-metrics-0\" (UID: \"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9\") " pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.178661 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqnpc\" (UniqueName: \"kubernetes.io/projected/0fd320e6-1b20-4a5d-890c-91e19f17f29c-kube-api-access-zqnpc\") pod \"ceilometer-0\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.224421 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.243564 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.626085 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 09:33:50 crc kubenswrapper[4846]: W1122 09:33:50.632151 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4377a3fa_e17a_42e4_ab0b_37f76e90dbf9.slice/crio-51c8669600c6bb0734b98d7d4be2e3e6b860fef81bdd9e04c46b695ec06afb02 WatchSource:0}: Error finding container 51c8669600c6bb0734b98d7d4be2e3e6b860fef81bdd9e04c46b695ec06afb02: Status 404 returned error can't find the container with id 51c8669600c6bb0734b98d7d4be2e3e6b860fef81bdd9e04c46b695ec06afb02 Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.796817 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9","Type":"ContainerStarted","Data":"51c8669600c6bb0734b98d7d4be2e3e6b860fef81bdd9e04c46b695ec06afb02"} Nov 22 09:33:50 crc kubenswrapper[4846]: I1122 09:33:50.841067 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:51 crc kubenswrapper[4846]: I1122 09:33:51.227393 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:33:51 crc kubenswrapper[4846]: I1122 09:33:51.811331 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fd320e6-1b20-4a5d-890c-91e19f17f29c","Type":"ContainerStarted","Data":"576c99669820ab55d3dd3353cfb7894b277fe9869e0f040140ddeaf9458dd9df"} Nov 22 09:33:52 crc kubenswrapper[4846]: I1122 09:33:52.832740 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4377a3fa-e17a-42e4-ab0b-37f76e90dbf9","Type":"ContainerStarted","Data":"be7513b13709fd27a844a3f4c3f91d4ef72cf69670656422751456cb13aa15b6"} Nov 22 09:33:52 crc kubenswrapper[4846]: I1122 09:33:52.833575 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 09:33:52 crc kubenswrapper[4846]: I1122 09:33:52.836323 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fd320e6-1b20-4a5d-890c-91e19f17f29c","Type":"ContainerStarted","Data":"cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0"} Nov 22 09:33:52 crc kubenswrapper[4846]: I1122 09:33:52.883965 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.628830329 podStartE2EDuration="3.883939959s" podCreationTimestamp="2025-11-22 09:33:49 +0000 UTC" firstStartedPulling="2025-11-22 09:33:50.635903849 +0000 UTC m=+1205.571593498" lastFinishedPulling="2025-11-22 09:33:51.891013479 +0000 UTC m=+1206.826703128" observedRunningTime="2025-11-22 09:33:52.876744168 +0000 UTC m=+1207.812433837" watchObservedRunningTime="2025-11-22 09:33:52.883939959 +0000 UTC m=+1207.819629618" Nov 22 09:33:53 crc kubenswrapper[4846]: I1122 09:33:53.854827 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fd320e6-1b20-4a5d-890c-91e19f17f29c","Type":"ContainerStarted","Data":"b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77"} Nov 22 09:33:53 crc kubenswrapper[4846]: I1122 09:33:53.855859 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0fd320e6-1b20-4a5d-890c-91e19f17f29c","Type":"ContainerStarted","Data":"aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb"} Nov 22 09:33:55 crc kubenswrapper[4846]: I1122 09:33:55.877058 4846 generic.go:334] "Generic (PLEG): container finished" podID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerID="896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f" exitCode=1 Nov 22 09:33:55 crc kubenswrapper[4846]: I1122 09:33:55.877088 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fd320e6-1b20-4a5d-890c-91e19f17f29c","Type":"ContainerDied","Data":"896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f"} Nov 22 09:33:55 crc kubenswrapper[4846]: I1122 09:33:55.877248 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="ceilometer-central-agent" containerID="cri-o://cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0" gracePeriod=30 Nov 22 09:33:55 crc kubenswrapper[4846]: I1122 09:33:55.877276 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="sg-core" containerID="cri-o://b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77" gracePeriod=30 Nov 22 09:33:55 crc kubenswrapper[4846]: I1122 09:33:55.877308 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="ceilometer-notification-agent" containerID="cri-o://aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb" gracePeriod=30 Nov 22 09:33:56 crc kubenswrapper[4846]: I1122 09:33:56.893835 4846 generic.go:334] "Generic (PLEG): container finished" podID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerID="b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77" exitCode=2 Nov 22 09:33:56 crc kubenswrapper[4846]: I1122 09:33:56.894757 4846 generic.go:334] "Generic (PLEG): container finished" podID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerID="aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb" exitCode=0 Nov 22 09:33:56 crc kubenswrapper[4846]: I1122 09:33:56.893911 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fd320e6-1b20-4a5d-890c-91e19f17f29c","Type":"ContainerDied","Data":"b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77"} Nov 22 09:33:56 crc kubenswrapper[4846]: I1122 09:33:56.894810 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fd320e6-1b20-4a5d-890c-91e19f17f29c","Type":"ContainerDied","Data":"aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb"} Nov 22 09:33:57 crc kubenswrapper[4846]: I1122 09:33:57.921953 4846 generic.go:334] "Generic (PLEG): container finished" podID="e9154a2c-895f-4ef4-921a-08305d1f8c4f" containerID="e4eff862f8b641426e53f1112558ad1879bdfa977c7fed124e3a5fdf0376720b" exitCode=0 Nov 22 09:33:57 crc kubenswrapper[4846]: I1122 09:33:57.922104 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8fzdp" event={"ID":"e9154a2c-895f-4ef4-921a-08305d1f8c4f","Type":"ContainerDied","Data":"e4eff862f8b641426e53f1112558ad1879bdfa977c7fed124e3a5fdf0376720b"} Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.310802 4846 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.460817 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-config-data\") pod \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.460976 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-scripts\") pod \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.461091 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6xgt\" (UniqueName: \"kubernetes.io/projected/e9154a2c-895f-4ef4-921a-08305d1f8c4f-kube-api-access-p6xgt\") pod \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.461266 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-combined-ca-bundle\") pod \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\" (UID: \"e9154a2c-895f-4ef4-921a-08305d1f8c4f\") " Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.467956 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-scripts" (OuterVolumeSpecName: "scripts") pod "e9154a2c-895f-4ef4-921a-08305d1f8c4f" (UID: "e9154a2c-895f-4ef4-921a-08305d1f8c4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.468462 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9154a2c-895f-4ef4-921a-08305d1f8c4f-kube-api-access-p6xgt" (OuterVolumeSpecName: "kube-api-access-p6xgt") pod "e9154a2c-895f-4ef4-921a-08305d1f8c4f" (UID: "e9154a2c-895f-4ef4-921a-08305d1f8c4f"). InnerVolumeSpecName "kube-api-access-p6xgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.492500 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9154a2c-895f-4ef4-921a-08305d1f8c4f" (UID: "e9154a2c-895f-4ef4-921a-08305d1f8c4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.496021 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-config-data" (OuterVolumeSpecName: "config-data") pod "e9154a2c-895f-4ef4-921a-08305d1f8c4f" (UID: "e9154a2c-895f-4ef4-921a-08305d1f8c4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.564032 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.564447 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.564458 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6xgt\" (UniqueName: \"kubernetes.io/projected/e9154a2c-895f-4ef4-921a-08305d1f8c4f-kube-api-access-p6xgt\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.564467 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9154a2c-895f-4ef4-921a-08305d1f8c4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.941288 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8fzdp" event={"ID":"e9154a2c-895f-4ef4-921a-08305d1f8c4f","Type":"ContainerDied","Data":"b2f66e3d8b29f0c2648593b3ae6aaeed881a0dba08e137dc392f57dfd1678c8d"} Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.941361 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2f66e3d8b29f0c2648593b3ae6aaeed881a0dba08e137dc392f57dfd1678c8d" Nov 22 09:33:59 crc kubenswrapper[4846]: I1122 09:33:59.941334 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8fzdp" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.092861 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:34:00 crc kubenswrapper[4846]: E1122 09:34:00.093341 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9154a2c-895f-4ef4-921a-08305d1f8c4f" containerName="nova-cell0-conductor-db-sync" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.093361 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9154a2c-895f-4ef4-921a-08305d1f8c4f" containerName="nova-cell0-conductor-db-sync" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.093574 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9154a2c-895f-4ef4-921a-08305d1f8c4f" containerName="nova-cell0-conductor-db-sync" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.098517 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.101300 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pt9t4" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.101303 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.111088 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.185903 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnbr\" (UniqueName: \"kubernetes.io/projected/525d7ecc-cc33-4162-82f8-bfa33a4b15ed-kube-api-access-5jnbr\") pod \"nova-cell0-conductor-0\" (UID: \"525d7ecc-cc33-4162-82f8-bfa33a4b15ed\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.186309 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/525d7ecc-cc33-4162-82f8-bfa33a4b15ed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"525d7ecc-cc33-4162-82f8-bfa33a4b15ed\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.187691 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525d7ecc-cc33-4162-82f8-bfa33a4b15ed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"525d7ecc-cc33-4162-82f8-bfa33a4b15ed\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.253377 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.290869 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnbr\" (UniqueName: \"kubernetes.io/projected/525d7ecc-cc33-4162-82f8-bfa33a4b15ed-kube-api-access-5jnbr\") pod \"nova-cell0-conductor-0\" (UID: \"525d7ecc-cc33-4162-82f8-bfa33a4b15ed\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.290965 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/525d7ecc-cc33-4162-82f8-bfa33a4b15ed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"525d7ecc-cc33-4162-82f8-bfa33a4b15ed\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.291020 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525d7ecc-cc33-4162-82f8-bfa33a4b15ed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"525d7ecc-cc33-4162-82f8-bfa33a4b15ed\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.297146 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/525d7ecc-cc33-4162-82f8-bfa33a4b15ed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"525d7ecc-cc33-4162-82f8-bfa33a4b15ed\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.297819 4846 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525d7ecc-cc33-4162-82f8-bfa33a4b15ed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"525d7ecc-cc33-4162-82f8-bfa33a4b15ed\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.318205 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnbr\" (UniqueName: \"kubernetes.io/projected/525d7ecc-cc33-4162-82f8-bfa33a4b15ed-kube-api-access-5jnbr\") pod \"nova-cell0-conductor-0\" (UID: \"525d7ecc-cc33-4162-82f8-bfa33a4b15ed\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.446159 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:00 crc kubenswrapper[4846]: I1122 09:34:00.951698 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:34:01 crc kubenswrapper[4846]: I1122 09:34:01.965772 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"525d7ecc-cc33-4162-82f8-bfa33a4b15ed","Type":"ContainerStarted","Data":"e77514df21dc00947dd5280aabee03a8712619984d15bf1952ec14b22b385f93"} Nov 22 09:34:01 crc kubenswrapper[4846]: I1122 09:34:01.966262 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"525d7ecc-cc33-4162-82f8-bfa33a4b15ed","Type":"ContainerStarted","Data":"024036ec63d37a0a9f8be8e5ebc740553264698b6312956b00528918e78d8d8e"} Nov 22 09:34:01 crc kubenswrapper[4846]: I1122 09:34:01.967590 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 22 09:34:01 crc kubenswrapper[4846]: I1122 09:34:01.992793 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.992765348 podStartE2EDuration="1.992765348s" podCreationTimestamp="2025-11-22 09:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:01.989476381 +0000 UTC m=+1216.925166030" watchObservedRunningTime="2025-11-22 09:34:01.992765348 +0000 UTC m=+1216.928454997" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.713591 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.852006 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-scripts\") pod \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.852964 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqnpc\" (UniqueName: \"kubernetes.io/projected/0fd320e6-1b20-4a5d-890c-91e19f17f29c-kube-api-access-zqnpc\") pod \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.853140 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-config-data\") pod \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.853379 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-sg-core-conf-yaml\") pod \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.853980 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-log-httpd\") pod \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.854191 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-run-httpd\") pod \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.854339 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-combined-ca-bundle\") pod \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\" (UID: \"0fd320e6-1b20-4a5d-890c-91e19f17f29c\") " Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.854476 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0fd320e6-1b20-4a5d-890c-91e19f17f29c" (UID: "0fd320e6-1b20-4a5d-890c-91e19f17f29c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.854578 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0fd320e6-1b20-4a5d-890c-91e19f17f29c" (UID: "0fd320e6-1b20-4a5d-890c-91e19f17f29c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.855925 4846 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.856025 4846 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0fd320e6-1b20-4a5d-890c-91e19f17f29c-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.860800 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd320e6-1b20-4a5d-890c-91e19f17f29c-kube-api-access-zqnpc" (OuterVolumeSpecName: "kube-api-access-zqnpc") pod "0fd320e6-1b20-4a5d-890c-91e19f17f29c" (UID: "0fd320e6-1b20-4a5d-890c-91e19f17f29c"). InnerVolumeSpecName "kube-api-access-zqnpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.862003 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-scripts" (OuterVolumeSpecName: "scripts") pod "0fd320e6-1b20-4a5d-890c-91e19f17f29c" (UID: "0fd320e6-1b20-4a5d-890c-91e19f17f29c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.905628 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0fd320e6-1b20-4a5d-890c-91e19f17f29c" (UID: "0fd320e6-1b20-4a5d-890c-91e19f17f29c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.933851 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fd320e6-1b20-4a5d-890c-91e19f17f29c" (UID: "0fd320e6-1b20-4a5d-890c-91e19f17f29c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.957651 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.957800 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.957892 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqnpc\" (UniqueName: \"kubernetes.io/projected/0fd320e6-1b20-4a5d-890c-91e19f17f29c-kube-api-access-zqnpc\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.957988 4846 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.974283 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-config-data" (OuterVolumeSpecName: "config-data") pod "0fd320e6-1b20-4a5d-890c-91e19f17f29c" (UID: "0fd320e6-1b20-4a5d-890c-91e19f17f29c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.981239 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fd320e6-1b20-4a5d-890c-91e19f17f29c","Type":"ContainerDied","Data":"cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0"} Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.981334 4846 scope.go:117] "RemoveContainer" containerID="896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.981280 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.994190 4846 generic.go:334] "Generic (PLEG): container finished" podID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerID="cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0" exitCode=0 Nov 22 09:34:02 crc kubenswrapper[4846]: I1122 09:34:02.994454 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0fd320e6-1b20-4a5d-890c-91e19f17f29c","Type":"ContainerDied","Data":"576c99669820ab55d3dd3353cfb7894b277fe9869e0f040140ddeaf9458dd9df"} Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.030704 4846 scope.go:117] "RemoveContainer" containerID="b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.051839 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.060446 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd320e6-1b20-4a5d-890c-91e19f17f29c-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.066171 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.073150 4846 scope.go:117] "RemoveContainer" containerID="aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.078466 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:34:03 crc kubenswrapper[4846]: E1122 09:34:03.079010 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="ceilometer-central-agent" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.079035 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="ceilometer-central-agent" Nov 22 09:34:03 crc kubenswrapper[4846]: E1122 09:34:03.079077 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="ceilometer-notification-agent" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.079087 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="ceilometer-notification-agent" Nov 22 09:34:03 crc kubenswrapper[4846]: E1122 09:34:03.079106 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="sg-core" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.079117 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="sg-core" Nov 22 09:34:03 crc kubenswrapper[4846]: E1122 09:34:03.079149 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="proxy-httpd" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.079156 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="proxy-httpd" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.079380 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="ceilometer-notification-agent" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.079398 
4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="proxy-httpd" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.079412 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="sg-core" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.079426 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" containerName="ceilometer-central-agent" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.081802 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.089917 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.090520 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.090965 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.091134 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.123032 4846 scope.go:117] "RemoveContainer" containerID="cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.149407 4846 scope.go:117] "RemoveContainer" containerID="896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f" Nov 22 09:34:03 crc kubenswrapper[4846]: E1122 09:34:03.149966 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f\": container with ID starting with 896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f not found: ID does not exist" containerID="896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.150002 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f"} err="failed to get container status \"896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f\": rpc error: code = NotFound desc = could not find container \"896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f\": container with ID starting with 896226f786ba2456dc55687c10418b8dfde7bfcf8a1737c2e757dd5f95d0e57f not found: ID does not exist" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.150024 4846 scope.go:117] "RemoveContainer" containerID="b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77" Nov 22 09:34:03 crc kubenswrapper[4846]: E1122 09:34:03.150455 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77\": container with ID starting with b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77 not found: ID does not exist" containerID="b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.150509 4846 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77"} err="failed to get container status \"b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77\": rpc error: code = NotFound desc = could not find container \"b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77\": container with ID starting with b2a18b1e445472392942f1689b8b9b2d13487d130500fb86ea0ac9c335668c77 not found: ID does not exist" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.150547 4846 scope.go:117] "RemoveContainer" containerID="aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb" Nov 22 09:34:03 crc kubenswrapper[4846]: E1122 09:34:03.150837 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb\": container with ID starting with aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb not found: ID does not exist" containerID="aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.150868 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb"} err="failed to get container status \"aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb\": rpc error: code = NotFound desc = could not find container \"aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb\": container with ID starting with aa84eee9ce1db6c5988aa3b1fffc342bcdae0fc2d986bca31170162aae533fbb not found: ID does not exist" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.150883 4846 scope.go:117] "RemoveContainer" containerID="cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0" Nov 22 09:34:03 crc kubenswrapper[4846]: E1122 09:34:03.151337 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0\": container with ID starting with cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0 not found: ID does not exist" containerID="cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.151414 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0"} err="failed to get container status \"cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0\": rpc error: code = NotFound desc = could not find container \"cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0\": container with ID starting with cc323f739a5353acb20ce88389021257dc0efbd3467b6fc1c88d04a6a5c86ea0 not found: ID does not exist" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.162543 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-run-httpd\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.162601 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-scripts\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.162641 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-config-data\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.162861 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-log-httpd\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.162935 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.163255 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnz6h\" (UniqueName: \"kubernetes.io/projected/3419e190-c6c9-409c-8c77-0ab4c20dee33-kube-api-access-mnz6h\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.163310 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.163411 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.265154 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-run-httpd\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.265241 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-scripts\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.265464 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-config-data\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.265583 4846 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-log-httpd\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.265743 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.265867 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnz6h\" (UniqueName: \"kubernetes.io/projected/3419e190-c6c9-409c-8c77-0ab4c20dee33-kube-api-access-mnz6h\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.265913 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.265967 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.267151 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-run-httpd\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.267205 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-log-httpd\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.273197 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.273796 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.274238 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-config-data\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.274283 4846 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-scripts\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.284634 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.287739 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnz6h\" (UniqueName: \"kubernetes.io/projected/3419e190-c6c9-409c-8c77-0ab4c20dee33-kube-api-access-mnz6h\") pod \"ceilometer-0\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") " pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.420810 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 09:34:03 crc kubenswrapper[4846]: I1122 09:34:03.952822 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 09:34:04 crc kubenswrapper[4846]: I1122 09:34:04.030151 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerStarted","Data":"5c51789b85f5041792af2c22f760e7f8244962f91776fbc98b70728aabb89efd"} Nov 22 09:34:04 crc kubenswrapper[4846]: I1122 09:34:04.054158 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd320e6-1b20-4a5d-890c-91e19f17f29c" path="/var/lib/kubelet/pods/0fd320e6-1b20-4a5d-890c-91e19f17f29c/volumes" Nov 22 09:34:05 crc kubenswrapper[4846]: I1122 09:34:05.043457 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerStarted","Data":"ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9"} Nov 22 09:34:06 crc kubenswrapper[4846]: I1122 09:34:06.064502 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerStarted","Data":"24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675"} Nov 22 09:34:08 crc kubenswrapper[4846]: I1122 09:34:08.106931 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerStarted","Data":"d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362"} Nov 22 09:34:09 crc kubenswrapper[4846]: I1122 09:34:09.134967 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerStarted","Data":"9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487"} Nov 22 09:34:09 crc kubenswrapper[4846]: I1122 09:34:09.136066 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 09:34:09 crc kubenswrapper[4846]: I1122 09:34:09.194335 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.623909071 podStartE2EDuration="6.194313459s" podCreationTimestamp="2025-11-22 09:34:03 +0000 UTC" firstStartedPulling="2025-11-22 09:34:03.962450633 +0000 UTC 
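[Note] The "Observed pod startup duration" entry above is plain arithmetic over the timestamps it prints: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). A minimal sketch, not kubelet code, reproducing the ceilometer-0 numbers:

```go
// latency.go — sketch of the arithmetic behind pod_startup_latency_tracker lines.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Matches timestamps like "2025-11-22 09:34:03.962450633 +0000 UTC";
	// Go accepts the fractional seconds during parsing even without them in the layout.
	t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-11-22 09:34:03 +0000 UTC")
	firstPull := mustParse("2025-11-22 09:34:03.962450633 +0000 UTC")
	lastPull := mustParse("2025-11-22 09:34:08.532855031 +0000 UTC")
	running := mustParse("2025-11-22 09:34:09.194313459 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)          // 6.194313459s, matches podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // ≈1.6239s; the kubelet uses the monotonic
	// m=+ offsets for the pull window, so the last digits differ slightly here.

	fmt.Printf("podStartE2EDuration=%v podStartSLOduration=%v\n", e2e, slo)
}
```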
Nov 22 09:34:10 crc kubenswrapper[4846]: I1122 09:34:10.478116 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.002472 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4bdbs"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.004037 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.007348 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.011741 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.013716 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4bdbs"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.045527 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.045583 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-config-data\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.045642 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-scripts\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.045693 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f89rq\" (UniqueName: \"kubernetes.io/projected/8ece6607-f6f8-4060-9a47-0ccd9560ce96-kube-api-access-f89rq\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.147191 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.147547 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-config-data\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.147627 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-scripts\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.147667 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f89rq\" (UniqueName: \"kubernetes.io/projected/8ece6607-f6f8-4060-9a47-0ccd9560ce96-kube-api-access-f89rq\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.164493 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.164574 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-config-data\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.182337 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-scripts\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.186820 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f89rq\" (UniqueName: \"kubernetes.io/projected/8ece6607-f6f8-4060-9a47-0ccd9560ce96-kube-api-access-f89rq\") pod \"nova-cell0-cell-mapping-4bdbs\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.270402 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.272169 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.274854 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.282918 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.338507 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4bdbs"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.380113 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.380145 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.380204 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbpm8\" (UniqueName: \"kubernetes.io/projected/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-kube-api-access-vbpm8\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.380291 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-config-data\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.380327 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-logs\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.382293 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.386266 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.421489 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.446736 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
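[Note] Every pod in this stretch goes through the same three-step volume sequence: reconciler_common.go:245 "VerifyControllerAttachedVolume started", reconciler_common.go:218 "MountVolume started", and operation_generator.go:637 "MountVolume.SetUp succeeded". A rough sketch for following that flow per volume when fed journal lines like these on stdin; the regexps and phase labels are this sketch's own, not kubelet identifiers, and they target the literal \"...\" escaping seen here:

```go
// volflow.go — classify kubelet volume-reconciler lines read from stdin.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var steps = []struct {
	name string
	re   *regexp.Regexp
}{
	{"attach-verified", regexp.MustCompile(`VerifyControllerAttachedVolume started for volume \\"([^\\]+)\\"`)},
	{"mount-started", regexp.MustCompile(`MountVolume started for volume \\"([^\\]+)\\"`)},
	{"setup-succeeded", regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\"([^\\]+)\\"`)},
}

// The structured pod="ns/name" suffix uses unescaped quotes, unlike the message body.
var podRe = regexp.MustCompile(`pod="([^"]+)"`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		for _, s := range steps {
			if m := s.re.FindStringSubmatch(line); m != nil {
				pod := ""
				if p := podRe.FindStringSubmatch(line); p != nil {
					pod = p[1]
				}
				fmt.Printf("%-16s %-40s %s\n", s.name, pod, m[1])
			}
		}
	}
}
```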
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.446895 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.451573 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.480130 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.485407 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.485949 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.485998 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvxqq\" (UniqueName: \"kubernetes.io/projected/112b685f-4312-457f-8f40-68eb8ca21167-kube-api-access-fvxqq\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.486073 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbpm8\" (UniqueName: \"kubernetes.io/projected/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-kube-api-access-vbpm8\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.486100 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-config-data\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.486174 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-config-data\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.486200 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112b685f-4312-457f-8f40-68eb8ca21167-logs\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.486238 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-logs\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.486725 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-logs\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.492743 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.502716 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-config-data\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.507541 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.508965 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.515325 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.537465 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbpm8\" (UniqueName: \"kubernetes.io/projected/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-kube-api-access-vbpm8\") pod \"nova-api-0\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591132 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591177 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591209 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-config-data\") pod \"nova-scheduler-0\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591234 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5wlj\" (UniqueName: \"kubernetes.io/projected/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-kube-api-access-r5wlj\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591261 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvxqq\" (UniqueName: \"kubernetes.io/projected/112b685f-4312-457f-8f40-68eb8ca21167-kube-api-access-fvxqq\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591290 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4qfd\" (UniqueName: \"kubernetes.io/projected/70c528eb-ae77-41b9-8a75-b724051d88bb-kube-api-access-h4qfd\") pod \"nova-scheduler-0\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591319 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-config-data\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591355 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591412 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112b685f-4312-457f-8f40-68eb8ca21167-logs\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.591459 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.597683 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112b685f-4312-457f-8f40-68eb8ca21167-logs\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.599731 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.603028 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-config-data\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.618250 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.640319 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.643335 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvxqq\" (UniqueName: \"kubernetes.io/projected/112b685f-4312-457f-8f40-68eb8ca21167-kube-api-access-fvxqq\") pod \"nova-metadata-0\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") " pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.718688 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ljjss"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.724739 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.725502 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.725531 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-config-data\") pod \"nova-scheduler-0\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.725568 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5wlj\" (UniqueName: \"kubernetes.io/projected/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-kube-api-access-r5wlj\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.725617 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4qfd\" (UniqueName: \"kubernetes.io/projected/70c528eb-ae77-41b9-8a75-b724051d88bb-kube-api-access-h4qfd\") pod \"nova-scheduler-0\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.725705 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.729475 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.743299 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.746720 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4qfd\" (UniqueName: \"kubernetes.io/projected/70c528eb-ae77-41b9-8a75-b724051d88bb-kube-api-access-h4qfd\") pod \"nova-scheduler-0\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.749417 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5wlj\" (UniqueName: \"kubernetes.io/projected/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-kube-api-access-r5wlj\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.749834 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.750596 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.750681 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-config-data\") pod \"nova-scheduler-0\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.760418 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ljjss"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.788952 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.829679 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.829727 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.829881 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-config\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.829969 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7hcn\" (UniqueName: \"kubernetes.io/projected/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-kube-api-access-r7hcn\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.830038 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.830113 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.937577 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7hcn\" (UniqueName: \"kubernetes.io/projected/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-kube-api-access-r7hcn\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.937643 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.937687 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.937723 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.937740 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.937835 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-config\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.938937 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-config\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.939916 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.940469 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.941095 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.941687 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.959844 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.978827 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7hcn\" (UniqueName: \"kubernetes.io/projected/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-kube-api-access-r7hcn\") pod \"dnsmasq-dns-757b4f8459-ljjss\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.982098 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4bdbs"]
Nov 22 09:34:11 crc kubenswrapper[4846]: I1122 09:34:11.999442 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:12 crc kubenswrapper[4846]: I1122 09:34:12.082686 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ljjss"
Nov 22 09:34:12 crc kubenswrapper[4846]: I1122 09:34:12.244002 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4bdbs" event={"ID":"8ece6607-f6f8-4060-9a47-0ccd9560ce96","Type":"ContainerStarted","Data":"4f308218cd6c053dd8cb621fde0e44491f3d639ece00b0935686d43653dffaa0"}
Nov 22 09:34:12 crc kubenswrapper[4846]: I1122 09:34:12.424514 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:34:12 crc kubenswrapper[4846]: W1122 09:34:12.835456 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod112b685f_4312_457f_8f40_68eb8ca21167.slice/crio-3a575539f5cbe5175f8d49645bbe2cb9de0724c543332fe4fe66662745412e99 WatchSource:0}: Error finding container 3a575539f5cbe5175f8d49645bbe2cb9de0724c543332fe4fe66662745412e99: Status 404 returned error can't find the container with id 3a575539f5cbe5175f8d49645bbe2cb9de0724c543332fe4fe66662745412e99
Nov 22 09:34:12 crc kubenswrapper[4846]: I1122 09:34:12.840822 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:34:12 crc kubenswrapper[4846]: I1122 09:34:12.987210 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.025694 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.063979 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fmhx6"]
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.071303 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.095229 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.096072 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.113175 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fmhx6"]
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.192668 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ljjss"]
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.207672 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.207828 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2rzv\" (UniqueName: \"kubernetes.io/projected/86cc211f-34a7-4337-9560-1b30aae9b177-kube-api-access-m2rzv\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.207925 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-config-data\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.207969 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-scripts\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.265123 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"112b685f-4312-457f-8f40-68eb8ca21167","Type":"ContainerStarted","Data":"3a575539f5cbe5175f8d49645bbe2cb9de0724c543332fe4fe66662745412e99"}
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.276300 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e9b69b9-98ae-4e5c-bc88-b960a615ee03","Type":"ContainerStarted","Data":"fc87464ea66c04c8540c22746cd653d7aca1ede0679e022fda621348835c7957"}
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.286404 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70c528eb-ae77-41b9-8a75-b724051d88bb","Type":"ContainerStarted","Data":"f89fd2ebfbd87d0bb5a06974723c7bd99633c9cf21e39dace556c416ef7f99fa"}
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.292908 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" event={"ID":"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c","Type":"ContainerStarted","Data":"3accb8a8d3190d6009e25b5d976fde078453ddf8a0b1ce2565278f24e5c3bf6f"}
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.301257 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7df46c0-c0b8-401d-95ba-5b42afc7b06c","Type":"ContainerStarted","Data":"b17051b2f886cbc058dbb1151cb4f61f0aa0d6995133d538d765486a1ac74222"}
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.309588 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.309692 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2rzv\" (UniqueName: \"kubernetes.io/projected/86cc211f-34a7-4337-9560-1b30aae9b177-kube-api-access-m2rzv\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.309781 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-config-data\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.309818 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-scripts\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.312973 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4bdbs" event={"ID":"8ece6607-f6f8-4060-9a47-0ccd9560ce96","Type":"ContainerStarted","Data":"332780e64e6f2a94d91f66052b9db84174170d0c9a16946788c4eda32334cf68"}
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.314177 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-scripts\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.321962 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.323958 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-config-data\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.327462 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2rzv\" (UniqueName: \"kubernetes.io/projected/86cc211f-34a7-4337-9560-1b30aae9b177-kube-api-access-m2rzv\") pod \"nova-cell1-conductor-db-sync-fmhx6\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.344535 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4bdbs" podStartSLOduration=3.344507706 podStartE2EDuration="3.344507706s" podCreationTimestamp="2025-11-22 09:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:13.330881097 +0000 UTC m=+1228.266570746" watchObservedRunningTime="2025-11-22 09:34:13.344507706 +0000 UTC m=+1228.280197355"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.417197 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fmhx6"
Nov 22 09:34:13 crc kubenswrapper[4846]: I1122 09:34:13.739610 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fmhx6"]
Nov 22 09:34:14 crc kubenswrapper[4846]: I1122 09:34:14.329187 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fmhx6" event={"ID":"86cc211f-34a7-4337-9560-1b30aae9b177","Type":"ContainerStarted","Data":"e318969a15becfc0ebbc3f5637565e5a61c752953d42935e2586fb34b690e022"}
Nov 22 09:34:14 crc kubenswrapper[4846]: I1122 09:34:14.329752 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fmhx6" event={"ID":"86cc211f-34a7-4337-9560-1b30aae9b177","Type":"ContainerStarted","Data":"97901f8f715204b5cd055711de228a0ad5b790321f850d873d43b0a268c459ab"}
Nov 22 09:34:14 crc kubenswrapper[4846]: I1122 09:34:14.336359 4846 generic.go:334] "Generic (PLEG): container finished" podID="38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" containerID="77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a" exitCode=0
Nov 22 09:34:14 crc kubenswrapper[4846]: I1122 09:34:14.336482 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" event={"ID":"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c","Type":"ContainerDied","Data":"77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a"}
Nov 22 09:34:14 crc kubenswrapper[4846]: I1122 09:34:14.356769 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-fmhx6" podStartSLOduration=1.356724671 podStartE2EDuration="1.356724671s" podCreationTimestamp="2025-11-22 09:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:14.352643351 +0000 UTC m=+1229.288333000" watchObservedRunningTime="2025-11-22 09:34:14.356724671 +0000 UTC m=+1229.292414320"
Nov 22 09:34:15 crc kubenswrapper[4846]: I1122 09:34:15.033528 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:34:15 crc kubenswrapper[4846]: I1122 09:34:15.071027 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
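[Note] The "SyncLoop (PLEG)" entries carry an event payload printed in a JSON-like form: the pod UID, an event type (ContainerStarted, ContainerDied), and a container or sandbox ID. The dnsmasq ContainerDied with exitCode=0 above, followed shortly by another ContainerStarted for the same pod, looks like an init container completing before the main container starts, though the log itself does not name the container. A sketch type mirroring (not importing) the kubelet's event struct:

```go
// plegevent.go — decode the event={...} payload from a PLEG log line.
package main

import (
	"encoding/json"
	"fmt"
)

// PodLifecycleEvent is this sketch's own type, shaped after the fields
// visible in the log output.
type PodLifecycleEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // e.g. ContainerStarted, ContainerDied
	Data string `json:"Data"` // container (or sandbox) ID
}

func main() {
	raw := `{"ID":"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c","Type":"ContainerDied","Data":"77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a"}`
	var ev PodLifecycleEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s %s...\n", ev.ID, ev.Type, ev.Data[:12])
}
```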
pod="openstack/dnsmasq-dns-757b4f8459-ljjss" event={"ID":"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c","Type":"ContainerStarted","Data":"60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374"} Nov 22 09:34:15 crc kubenswrapper[4846]: I1122 09:34:15.356340 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" Nov 22 09:34:15 crc kubenswrapper[4846]: I1122 09:34:15.388245 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" podStartSLOduration=4.38821748 podStartE2EDuration="4.38821748s" podCreationTimestamp="2025-11-22 09:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:15.383150342 +0000 UTC m=+1230.318839991" watchObservedRunningTime="2025-11-22 09:34:15.38821748 +0000 UTC m=+1230.323907129" Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.395485 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70c528eb-ae77-41b9-8a75-b724051d88bb","Type":"ContainerStarted","Data":"4ad6d3e06eab1993791cc1f9b8769c6bf38190ba63c90738c2d8b795d24b993a"} Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.415926 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"112b685f-4312-457f-8f40-68eb8ca21167","Type":"ContainerStarted","Data":"2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b"} Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.416011 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"112b685f-4312-457f-8f40-68eb8ca21167","Type":"ContainerStarted","Data":"d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a"} Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.416237 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="112b685f-4312-457f-8f40-68eb8ca21167" containerName="nova-metadata-log" containerID="cri-o://d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a" gracePeriod=30 Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.416920 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="112b685f-4312-457f-8f40-68eb8ca21167" containerName="nova-metadata-metadata" containerID="cri-o://2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b" gracePeriod=30 Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.419924 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.191041809 podStartE2EDuration="7.419893863s" podCreationTimestamp="2025-11-22 09:34:11 +0000 UTC" firstStartedPulling="2025-11-22 09:34:13.000035235 +0000 UTC m=+1227.935724904" lastFinishedPulling="2025-11-22 09:34:17.228887309 +0000 UTC m=+1232.164576958" observedRunningTime="2025-11-22 09:34:18.412543529 +0000 UTC m=+1233.348233188" watchObservedRunningTime="2025-11-22 09:34:18.419893863 +0000 UTC m=+1233.355583512" Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.427528 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e9b69b9-98ae-4e5c-bc88-b960a615ee03","Type":"ContainerStarted","Data":"715dd1f0953d1afce0f977703f9fc9def739067ea07ede9777ba184f315bbf44"} Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.427650 4846 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e9b69b9-98ae-4e5c-bc88-b960a615ee03","Type":"ContainerStarted","Data":"6fc1e247eaf5d86a1288a1c211b291a229e4caf17809f8ceee1ccf91eeaaaa79"} Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.430657 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7df46c0-c0b8-401d-95ba-5b42afc7b06c","Type":"ContainerStarted","Data":"5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82"} Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.431130 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f7df46c0-c0b8-401d-95ba-5b42afc7b06c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82" gracePeriod=30 Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.487826 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.095718928 podStartE2EDuration="7.487792524s" podCreationTimestamp="2025-11-22 09:34:11 +0000 UTC" firstStartedPulling="2025-11-22 09:34:12.838063519 +0000 UTC m=+1227.773753168" lastFinishedPulling="2025-11-22 09:34:17.230137115 +0000 UTC m=+1232.165826764" observedRunningTime="2025-11-22 09:34:18.450987396 +0000 UTC m=+1233.386677045" watchObservedRunningTime="2025-11-22 09:34:18.487792524 +0000 UTC m=+1233.423482173" Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.494019 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.767233536 podStartE2EDuration="7.494000785s" podCreationTimestamp="2025-11-22 09:34:11 +0000 UTC" firstStartedPulling="2025-11-22 09:34:12.511708198 +0000 UTC m=+1227.447397847" lastFinishedPulling="2025-11-22 09:34:17.238475447 +0000 UTC m=+1232.174165096" observedRunningTime="2025-11-22 09:34:18.476678001 +0000 UTC m=+1233.412367650" watchObservedRunningTime="2025-11-22 09:34:18.494000785 +0000 UTC m=+1233.429690444" Nov 22 09:34:18 crc kubenswrapper[4846]: I1122 09:34:18.515886 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.345827249 podStartE2EDuration="7.515854359s" podCreationTimestamp="2025-11-22 09:34:11 +0000 UTC" firstStartedPulling="2025-11-22 09:34:13.05959657 +0000 UTC m=+1227.995286219" lastFinishedPulling="2025-11-22 09:34:17.22962368 +0000 UTC m=+1232.165313329" observedRunningTime="2025-11-22 09:34:18.503697026 +0000 UTC m=+1233.439386675" watchObservedRunningTime="2025-11-22 09:34:18.515854359 +0000 UTC m=+1233.451544008" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.426302 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.442105 4846 generic.go:334] "Generic (PLEG): container finished" podID="112b685f-4312-457f-8f40-68eb8ca21167" containerID="2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b" exitCode=0 Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.442138 4846 generic.go:334] "Generic (PLEG): container finished" podID="112b685f-4312-457f-8f40-68eb8ca21167" containerID="d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a" exitCode=143 Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.443212 4846 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.443212 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.443373 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"112b685f-4312-457f-8f40-68eb8ca21167","Type":"ContainerDied","Data":"2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b"}
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.443407 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"112b685f-4312-457f-8f40-68eb8ca21167","Type":"ContainerDied","Data":"d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a"}
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.443419 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"112b685f-4312-457f-8f40-68eb8ca21167","Type":"ContainerDied","Data":"3a575539f5cbe5175f8d49645bbe2cb9de0724c543332fe4fe66662745412e99"}
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.443438 4846 scope.go:117] "RemoveContainer" containerID="2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.475394 4846 scope.go:117] "RemoveContainer" containerID="d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.499402 4846 scope.go:117] "RemoveContainer" containerID="2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b"
Nov 22 09:34:19 crc kubenswrapper[4846]: E1122 09:34:19.499925 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b\": container with ID starting with 2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b not found: ID does not exist" containerID="2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.499971 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b"} err="failed to get container status \"2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b\": rpc error: code = NotFound desc = could not find container \"2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b\": container with ID starting with 2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b not found: ID does not exist"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.500005 4846 scope.go:117] "RemoveContainer" containerID="d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a"
Nov 22 09:34:19 crc kubenswrapper[4846]: E1122 09:34:19.500404 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a\": container with ID starting with d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a not found: ID does not exist" containerID="d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.500440 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a"} err="failed to get container status \"d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a\": rpc error: code = NotFound desc = could not find container \"d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a\": container with ID starting with d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a not found: ID does not exist"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.500474 4846 scope.go:117] "RemoveContainer" containerID="2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.500888 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b"} err="failed to get container status \"2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b\": rpc error: code = NotFound desc = could not find container \"2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b\": container with ID starting with 2b4b084285ca5bb340c39f0f67552e13447530e95d8d6e276bf757d48d055b7b not found: ID does not exist"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.500927 4846 scope.go:117] "RemoveContainer" containerID="d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.501300 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a"} err="failed to get container status \"d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a\": rpc error: code = NotFound desc = could not find container \"d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a\": container with ID starting with d00b9dbdd710c06d8b5ef393d072ec20f64417b882d36ae5961be1b94fbad91a not found: ID does not exist"
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.589215 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-config-data\") pod \"112b685f-4312-457f-8f40-68eb8ca21167\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") "
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.589611 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112b685f-4312-457f-8f40-68eb8ca21167-logs\") pod \"112b685f-4312-457f-8f40-68eb8ca21167\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") "
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.589711 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-combined-ca-bundle\") pod \"112b685f-4312-457f-8f40-68eb8ca21167\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") "
Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.590171 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvxqq\" (UniqueName: \"kubernetes.io/projected/112b685f-4312-457f-8f40-68eb8ca21167-kube-api-access-fvxqq\") pod \"112b685f-4312-457f-8f40-68eb8ca21167\" (UID: \"112b685f-4312-457f-8f40-68eb8ca21167\") "
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.591743 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112b685f-4312-457f-8f40-68eb8ca21167-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.597987 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112b685f-4312-457f-8f40-68eb8ca21167-kube-api-access-fvxqq" (OuterVolumeSpecName: "kube-api-access-fvxqq") pod "112b685f-4312-457f-8f40-68eb8ca21167" (UID: "112b685f-4312-457f-8f40-68eb8ca21167"). InnerVolumeSpecName "kube-api-access-fvxqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.628387 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-config-data" (OuterVolumeSpecName: "config-data") pod "112b685f-4312-457f-8f40-68eb8ca21167" (UID: "112b685f-4312-457f-8f40-68eb8ca21167"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.635739 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "112b685f-4312-457f-8f40-68eb8ca21167" (UID: "112b685f-4312-457f-8f40-68eb8ca21167"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.693809 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.693845 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112b685f-4312-457f-8f40-68eb8ca21167-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.693856 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvxqq\" (UniqueName: \"kubernetes.io/projected/112b685f-4312-457f-8f40-68eb8ca21167-kube-api-access-fvxqq\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.779722 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.790976 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.810268 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:19 crc kubenswrapper[4846]: E1122 09:34:19.810757 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112b685f-4312-457f-8f40-68eb8ca21167" containerName="nova-metadata-metadata" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.810779 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="112b685f-4312-457f-8f40-68eb8ca21167" containerName="nova-metadata-metadata" Nov 22 09:34:19 crc kubenswrapper[4846]: E1122 09:34:19.810838 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112b685f-4312-457f-8f40-68eb8ca21167" containerName="nova-metadata-log" Nov 22 09:34:19 crc 
kubenswrapper[4846]: I1122 09:34:19.810847 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="112b685f-4312-457f-8f40-68eb8ca21167" containerName="nova-metadata-log" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.811124 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="112b685f-4312-457f-8f40-68eb8ca21167" containerName="nova-metadata-metadata" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.811142 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="112b685f-4312-457f-8f40-68eb8ca21167" containerName="nova-metadata-log" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.812413 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.827490 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.828006 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 09:34:19 crc kubenswrapper[4846]: I1122 09:34:19.832348 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.001635 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhhj\" (UniqueName: \"kubernetes.io/projected/77792561-32ce-4a62-8f39-2d273ccde671-kube-api-access-dbhhj\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.001698 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-config-data\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.001991 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.002478 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.002663 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77792561-32ce-4a62-8f39-2d273ccde671-logs\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.067326 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="112b685f-4312-457f-8f40-68eb8ca21167" path="/var/lib/kubelet/pods/112b685f-4312-457f-8f40-68eb8ca21167/volumes" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.104914 4846 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.105041 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.105105 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77792561-32ce-4a62-8f39-2d273ccde671-logs\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.105175 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhhj\" (UniqueName: \"kubernetes.io/projected/77792561-32ce-4a62-8f39-2d273ccde671-kube-api-access-dbhhj\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.105206 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-config-data\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.105713 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77792561-32ce-4a62-8f39-2d273ccde671-logs\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.109988 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.113768 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-config-data\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.119661 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.126510 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhhj\" (UniqueName: \"kubernetes.io/projected/77792561-32ce-4a62-8f39-2d273ccde671-kube-api-access-dbhhj\") pod \"nova-metadata-0\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.133599 4846 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:34:20 crc kubenswrapper[4846]: I1122 09:34:20.665617 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:21 crc kubenswrapper[4846]: I1122 09:34:21.474685 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77792561-32ce-4a62-8f39-2d273ccde671","Type":"ContainerStarted","Data":"f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec"} Nov 22 09:34:21 crc kubenswrapper[4846]: I1122 09:34:21.475103 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77792561-32ce-4a62-8f39-2d273ccde671","Type":"ContainerStarted","Data":"1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258"} Nov 22 09:34:21 crc kubenswrapper[4846]: I1122 09:34:21.475119 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77792561-32ce-4a62-8f39-2d273ccde671","Type":"ContainerStarted","Data":"55e11659cc84968116cb83318b877f841ebfb555908f0c3f2042eb4530c17354"} Nov 22 09:34:21 crc kubenswrapper[4846]: I1122 09:34:21.500255 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.500231107 podStartE2EDuration="2.500231107s" podCreationTimestamp="2025-11-22 09:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:21.499098894 +0000 UTC m=+1236.434788543" watchObservedRunningTime="2025-11-22 09:34:21.500231107 +0000 UTC m=+1236.435920766" Nov 22 09:34:21 crc kubenswrapper[4846]: I1122 09:34:21.619187 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:34:21 crc kubenswrapper[4846]: I1122 09:34:21.619244 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:34:21 crc kubenswrapper[4846]: I1122 09:34:21.961147 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 09:34:21 crc kubenswrapper[4846]: I1122 09:34:21.961861 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 09:34:21 crc kubenswrapper[4846]: I1122 09:34:21.990224 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.000086 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.085203 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.235597 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fpp2t"] Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.236349 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" podUID="ee773f3f-4677-4ceb-957f-a7c1743688a3" containerName="dnsmasq-dns" containerID="cri-o://56769491f945c967a64c85ac8dc8417e4e0cf02ef727dad109906b64c5a332dd" gracePeriod=10 Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.516648 4846 generic.go:334] "Generic (PLEG): container finished" 
podID="ee773f3f-4677-4ceb-957f-a7c1743688a3" containerID="56769491f945c967a64c85ac8dc8417e4e0cf02ef727dad109906b64c5a332dd" exitCode=0 Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.517431 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" event={"ID":"ee773f3f-4677-4ceb-957f-a7c1743688a3","Type":"ContainerDied","Data":"56769491f945c967a64c85ac8dc8417e4e0cf02ef727dad109906b64c5a332dd"} Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.537856 4846 generic.go:334] "Generic (PLEG): container finished" podID="8ece6607-f6f8-4060-9a47-0ccd9560ce96" containerID="332780e64e6f2a94d91f66052b9db84174170d0c9a16946788c4eda32334cf68" exitCode=0 Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.538295 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4bdbs" event={"ID":"8ece6607-f6f8-4060-9a47-0ccd9560ce96","Type":"ContainerDied","Data":"332780e64e6f2a94d91f66052b9db84174170d0c9a16946788c4eda32334cf68"} Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.674667 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.705342 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:34:22 crc kubenswrapper[4846]: I1122 09:34:22.706748 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.036219 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.180978 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-svc\") pod \"ee773f3f-4677-4ceb-957f-a7c1743688a3\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.181334 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-config\") pod \"ee773f3f-4677-4ceb-957f-a7c1743688a3\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.181375 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-nb\") pod \"ee773f3f-4677-4ceb-957f-a7c1743688a3\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.181392 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-sb\") pod \"ee773f3f-4677-4ceb-957f-a7c1743688a3\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.181495 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-swift-storage-0\") pod \"ee773f3f-4677-4ceb-957f-a7c1743688a3\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.181659 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqq4d\" (UniqueName: \"kubernetes.io/projected/ee773f3f-4677-4ceb-957f-a7c1743688a3-kube-api-access-dqq4d\") pod \"ee773f3f-4677-4ceb-957f-a7c1743688a3\" (UID: \"ee773f3f-4677-4ceb-957f-a7c1743688a3\") " Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.196303 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee773f3f-4677-4ceb-957f-a7c1743688a3-kube-api-access-dqq4d" (OuterVolumeSpecName: "kube-api-access-dqq4d") pod "ee773f3f-4677-4ceb-957f-a7c1743688a3" (UID: "ee773f3f-4677-4ceb-957f-a7c1743688a3"). InnerVolumeSpecName "kube-api-access-dqq4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.241473 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee773f3f-4677-4ceb-957f-a7c1743688a3" (UID: "ee773f3f-4677-4ceb-957f-a7c1743688a3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.247648 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee773f3f-4677-4ceb-957f-a7c1743688a3" (UID: "ee773f3f-4677-4ceb-957f-a7c1743688a3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.247943 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee773f3f-4677-4ceb-957f-a7c1743688a3" (UID: "ee773f3f-4677-4ceb-957f-a7c1743688a3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.248981 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee773f3f-4677-4ceb-957f-a7c1743688a3" (UID: "ee773f3f-4677-4ceb-957f-a7c1743688a3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.266070 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-config" (OuterVolumeSpecName: "config") pod "ee773f3f-4677-4ceb-957f-a7c1743688a3" (UID: "ee773f3f-4677-4ceb-957f-a7c1743688a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.283830 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.283867 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.283881 4846 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.283895 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqq4d\" (UniqueName: \"kubernetes.io/projected/ee773f3f-4677-4ceb-957f-a7c1743688a3-kube-api-access-dqq4d\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.283909 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.284021 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee773f3f-4677-4ceb-957f-a7c1743688a3-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.549796 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" event={"ID":"ee773f3f-4677-4ceb-957f-a7c1743688a3","Type":"ContainerDied","Data":"b55984033fd72ca94b8d4e6dca48704e261e1053dd768cb1aab3c6a47e5cb424"} Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.549840 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-fpp2t" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.549877 4846 scope.go:117] "RemoveContainer" containerID="56769491f945c967a64c85ac8dc8417e4e0cf02ef727dad109906b64c5a332dd" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.553034 4846 generic.go:334] "Generic (PLEG): container finished" podID="86cc211f-34a7-4337-9560-1b30aae9b177" containerID="e318969a15becfc0ebbc3f5637565e5a61c752953d42935e2586fb34b690e022" exitCode=0 Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.553897 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fmhx6" event={"ID":"86cc211f-34a7-4337-9560-1b30aae9b177","Type":"ContainerDied","Data":"e318969a15becfc0ebbc3f5637565e5a61c752953d42935e2586fb34b690e022"} Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.591175 4846 scope.go:117] "RemoveContainer" containerID="9cb7591cb1576a301fae105d58e02ce4eeb6a551e1cb580f61cb55e2d62a74e2" Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.605916 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fpp2t"] Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.659756 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fpp2t"] Nov 22 09:34:23 crc kubenswrapper[4846]: I1122 09:34:23.971586 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4bdbs" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.057166 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee773f3f-4677-4ceb-957f-a7c1743688a3" path="/var/lib/kubelet/pods/ee773f3f-4677-4ceb-957f-a7c1743688a3/volumes" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.099960 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-combined-ca-bundle\") pod \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.100618 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-config-data\") pod \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.100827 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-scripts\") pod \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.101082 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f89rq\" (UniqueName: \"kubernetes.io/projected/8ece6607-f6f8-4060-9a47-0ccd9560ce96-kube-api-access-f89rq\") pod \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\" (UID: \"8ece6607-f6f8-4060-9a47-0ccd9560ce96\") " Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.106635 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-scripts" (OuterVolumeSpecName: "scripts") pod "8ece6607-f6f8-4060-9a47-0ccd9560ce96" (UID: "8ece6607-f6f8-4060-9a47-0ccd9560ce96"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.107097 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ece6607-f6f8-4060-9a47-0ccd9560ce96-kube-api-access-f89rq" (OuterVolumeSpecName: "kube-api-access-f89rq") pod "8ece6607-f6f8-4060-9a47-0ccd9560ce96" (UID: "8ece6607-f6f8-4060-9a47-0ccd9560ce96"). InnerVolumeSpecName "kube-api-access-f89rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.134866 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ece6607-f6f8-4060-9a47-0ccd9560ce96" (UID: "8ece6607-f6f8-4060-9a47-0ccd9560ce96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.143858 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-config-data" (OuterVolumeSpecName: "config-data") pod "8ece6607-f6f8-4060-9a47-0ccd9560ce96" (UID: "8ece6607-f6f8-4060-9a47-0ccd9560ce96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.206220 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f89rq\" (UniqueName: \"kubernetes.io/projected/8ece6607-f6f8-4060-9a47-0ccd9560ce96-kube-api-access-f89rq\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.206259 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.206270 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.206280 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ece6607-f6f8-4060-9a47-0ccd9560ce96-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.580680 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4bdbs" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.580683 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4bdbs" event={"ID":"8ece6607-f6f8-4060-9a47-0ccd9560ce96","Type":"ContainerDied","Data":"4f308218cd6c053dd8cb621fde0e44491f3d639ece00b0935686d43653dffaa0"} Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.581365 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f308218cd6c053dd8cb621fde0e44491f3d639ece00b0935686d43653dffaa0" Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.807174 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.807468 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-log" containerID="cri-o://6fc1e247eaf5d86a1288a1c211b291a229e4caf17809f8ceee1ccf91eeaaaa79" gracePeriod=30 Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.808082 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-api" containerID="cri-o://715dd1f0953d1afce0f977703f9fc9def739067ea07ede9777ba184f315bbf44" gracePeriod=30 Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.841938 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.842835 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="70c528eb-ae77-41b9-8a75-b724051d88bb" containerName="nova-scheduler-scheduler" containerID="cri-o://4ad6d3e06eab1993791cc1f9b8769c6bf38190ba63c90738c2d8b795d24b993a" gracePeriod=30 Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.866242 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.866563 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="77792561-32ce-4a62-8f39-2d273ccde671" containerName="nova-metadata-log" containerID="cri-o://1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258" gracePeriod=30 Nov 22 09:34:24 crc kubenswrapper[4846]: I1122 09:34:24.866752 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="77792561-32ce-4a62-8f39-2d273ccde671" containerName="nova-metadata-metadata" containerID="cri-o://f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec" gracePeriod=30 Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.133801 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.134321 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.151116 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fmhx6" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.235706 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-scripts\") pod \"86cc211f-34a7-4337-9560-1b30aae9b177\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.235798 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2rzv\" (UniqueName: \"kubernetes.io/projected/86cc211f-34a7-4337-9560-1b30aae9b177-kube-api-access-m2rzv\") pod \"86cc211f-34a7-4337-9560-1b30aae9b177\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.235841 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-combined-ca-bundle\") pod \"86cc211f-34a7-4337-9560-1b30aae9b177\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.236104 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-config-data\") pod \"86cc211f-34a7-4337-9560-1b30aae9b177\" (UID: \"86cc211f-34a7-4337-9560-1b30aae9b177\") " Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.245567 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-scripts" (OuterVolumeSpecName: "scripts") pod "86cc211f-34a7-4337-9560-1b30aae9b177" (UID: "86cc211f-34a7-4337-9560-1b30aae9b177"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.246407 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86cc211f-34a7-4337-9560-1b30aae9b177-kube-api-access-m2rzv" (OuterVolumeSpecName: "kube-api-access-m2rzv") pod "86cc211f-34a7-4337-9560-1b30aae9b177" (UID: "86cc211f-34a7-4337-9560-1b30aae9b177"). InnerVolumeSpecName "kube-api-access-m2rzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.284251 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-config-data" (OuterVolumeSpecName: "config-data") pod "86cc211f-34a7-4337-9560-1b30aae9b177" (UID: "86cc211f-34a7-4337-9560-1b30aae9b177"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.290591 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86cc211f-34a7-4337-9560-1b30aae9b177" (UID: "86cc211f-34a7-4337-9560-1b30aae9b177"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.340274 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.340314 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2rzv\" (UniqueName: \"kubernetes.io/projected/86cc211f-34a7-4337-9560-1b30aae9b177-kube-api-access-m2rzv\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.340331 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.340345 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86cc211f-34a7-4337-9560-1b30aae9b177-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.396384 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.543136 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-combined-ca-bundle\") pod \"77792561-32ce-4a62-8f39-2d273ccde671\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.543728 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-config-data\") pod \"77792561-32ce-4a62-8f39-2d273ccde671\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.543839 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbhhj\" (UniqueName: \"kubernetes.io/projected/77792561-32ce-4a62-8f39-2d273ccde671-kube-api-access-dbhhj\") pod \"77792561-32ce-4a62-8f39-2d273ccde671\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.543915 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77792561-32ce-4a62-8f39-2d273ccde671-logs\") pod \"77792561-32ce-4a62-8f39-2d273ccde671\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.544001 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-nova-metadata-tls-certs\") pod \"77792561-32ce-4a62-8f39-2d273ccde671\" (UID: \"77792561-32ce-4a62-8f39-2d273ccde671\") " Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.544264 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77792561-32ce-4a62-8f39-2d273ccde671-logs" (OuterVolumeSpecName: "logs") pod "77792561-32ce-4a62-8f39-2d273ccde671" (UID: "77792561-32ce-4a62-8f39-2d273ccde671"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.544907 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77792561-32ce-4a62-8f39-2d273ccde671-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.548843 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77792561-32ce-4a62-8f39-2d273ccde671-kube-api-access-dbhhj" (OuterVolumeSpecName: "kube-api-access-dbhhj") pod "77792561-32ce-4a62-8f39-2d273ccde671" (UID: "77792561-32ce-4a62-8f39-2d273ccde671"). InnerVolumeSpecName "kube-api-access-dbhhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.571770 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-config-data" (OuterVolumeSpecName: "config-data") pod "77792561-32ce-4a62-8f39-2d273ccde671" (UID: "77792561-32ce-4a62-8f39-2d273ccde671"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.579216 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77792561-32ce-4a62-8f39-2d273ccde671" (UID: "77792561-32ce-4a62-8f39-2d273ccde671"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.593663 4846 generic.go:334] "Generic (PLEG): container finished" podID="77792561-32ce-4a62-8f39-2d273ccde671" containerID="f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec" exitCode=0 Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.593701 4846 generic.go:334] "Generic (PLEG): container finished" podID="77792561-32ce-4a62-8f39-2d273ccde671" containerID="1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258" exitCode=143 Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.593746 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77792561-32ce-4a62-8f39-2d273ccde671","Type":"ContainerDied","Data":"f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec"} Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.593773 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77792561-32ce-4a62-8f39-2d273ccde671","Type":"ContainerDied","Data":"1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258"} Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.593784 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77792561-32ce-4a62-8f39-2d273ccde671","Type":"ContainerDied","Data":"55e11659cc84968116cb83318b877f841ebfb555908f0c3f2042eb4530c17354"} Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.593800 4846 scope.go:117] "RemoveContainer" containerID="f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.593923 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.597059 4846 generic.go:334] "Generic (PLEG): container finished" podID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerID="6fc1e247eaf5d86a1288a1c211b291a229e4caf17809f8ceee1ccf91eeaaaa79" exitCode=143 Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.597143 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e9b69b9-98ae-4e5c-bc88-b960a615ee03","Type":"ContainerDied","Data":"6fc1e247eaf5d86a1288a1c211b291a229e4caf17809f8ceee1ccf91eeaaaa79"} Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.599633 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fmhx6" event={"ID":"86cc211f-34a7-4337-9560-1b30aae9b177","Type":"ContainerDied","Data":"97901f8f715204b5cd055711de228a0ad5b790321f850d873d43b0a268c459ab"} Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.599662 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97901f8f715204b5cd055711de228a0ad5b790321f850d873d43b0a268c459ab" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.599847 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fmhx6" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.620123 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "77792561-32ce-4a62-8f39-2d273ccde671" (UID: "77792561-32ce-4a62-8f39-2d273ccde671"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.631321 4846 scope.go:117] "RemoveContainer" containerID="1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.652413 4846 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.652792 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.652872 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77792561-32ce-4a62-8f39-2d273ccde671-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.652956 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbhhj\" (UniqueName: \"kubernetes.io/projected/77792561-32ce-4a62-8f39-2d273ccde671-kube-api-access-dbhhj\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.667840 4846 scope.go:117] "RemoveContainer" containerID="f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec" Nov 22 09:34:25 crc kubenswrapper[4846]: E1122 09:34:25.668730 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec\": container with ID starting with 
f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec not found: ID does not exist" containerID="f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.668869 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec"} err="failed to get container status \"f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec\": rpc error: code = NotFound desc = could not find container \"f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec\": container with ID starting with f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec not found: ID does not exist" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.668966 4846 scope.go:117] "RemoveContainer" containerID="1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258" Nov 22 09:34:25 crc kubenswrapper[4846]: E1122 09:34:25.669327 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258\": container with ID starting with 1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258 not found: ID does not exist" containerID="1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.669376 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258"} err="failed to get container status \"1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258\": rpc error: code = NotFound desc = could not find container \"1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258\": container with ID starting with 1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258 not found: ID does not exist" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.669409 4846 scope.go:117] "RemoveContainer" containerID="f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.669678 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec"} err="failed to get container status \"f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec\": rpc error: code = NotFound desc = could not find container \"f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec\": container with ID starting with f8f29cada7c039422b46979b2364ef6ca3c6db8a5a3c3b81da8c1523636b70ec not found: ID does not exist" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.669701 4846 scope.go:117] "RemoveContainer" containerID="1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.669888 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258"} err="failed to get container status \"1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258\": rpc error: code = NotFound desc = could not find container \"1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258\": container with ID starting with 1eeaf7fa37fe07cac1fc39b8c2b255e8ec8a337df263f93aa48272240c057258 not found: ID does not exist" Nov 22 
09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.673331 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:34:25 crc kubenswrapper[4846]: E1122 09:34:25.673827 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee773f3f-4677-4ceb-957f-a7c1743688a3" containerName="dnsmasq-dns" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.673844 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee773f3f-4677-4ceb-957f-a7c1743688a3" containerName="dnsmasq-dns" Nov 22 09:34:25 crc kubenswrapper[4846]: E1122 09:34:25.673866 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86cc211f-34a7-4337-9560-1b30aae9b177" containerName="nova-cell1-conductor-db-sync" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.673873 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cc211f-34a7-4337-9560-1b30aae9b177" containerName="nova-cell1-conductor-db-sync" Nov 22 09:34:25 crc kubenswrapper[4846]: E1122 09:34:25.673902 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77792561-32ce-4a62-8f39-2d273ccde671" containerName="nova-metadata-metadata" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.673908 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="77792561-32ce-4a62-8f39-2d273ccde671" containerName="nova-metadata-metadata" Nov 22 09:34:25 crc kubenswrapper[4846]: E1122 09:34:25.673917 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ece6607-f6f8-4060-9a47-0ccd9560ce96" containerName="nova-manage" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.673923 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ece6607-f6f8-4060-9a47-0ccd9560ce96" containerName="nova-manage" Nov 22 09:34:25 crc kubenswrapper[4846]: E1122 09:34:25.673936 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee773f3f-4677-4ceb-957f-a7c1743688a3" containerName="init" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.673942 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee773f3f-4677-4ceb-957f-a7c1743688a3" containerName="init" Nov 22 09:34:25 crc kubenswrapper[4846]: E1122 09:34:25.673955 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77792561-32ce-4a62-8f39-2d273ccde671" containerName="nova-metadata-log" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.673961 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="77792561-32ce-4a62-8f39-2d273ccde671" containerName="nova-metadata-log" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.674181 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee773f3f-4677-4ceb-957f-a7c1743688a3" containerName="dnsmasq-dns" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.674212 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ece6607-f6f8-4060-9a47-0ccd9560ce96" containerName="nova-manage" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.674227 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="77792561-32ce-4a62-8f39-2d273ccde671" containerName="nova-metadata-log" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.674236 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="77792561-32ce-4a62-8f39-2d273ccde671" containerName="nova-metadata-metadata" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.674254 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="86cc211f-34a7-4337-9560-1b30aae9b177" 
containerName="nova-cell1-conductor-db-sync" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.678166 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.684650 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.689496 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.856148 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0d3ce0-49e4-4e73-b2e9-ce405a023987-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7b0d3ce0-49e4-4e73-b2e9-ce405a023987\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.856225 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvr5m\" (UniqueName: \"kubernetes.io/projected/7b0d3ce0-49e4-4e73-b2e9-ce405a023987-kube-api-access-hvr5m\") pod \"nova-cell1-conductor-0\" (UID: \"7b0d3ce0-49e4-4e73-b2e9-ce405a023987\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.856396 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0d3ce0-49e4-4e73-b2e9-ce405a023987-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7b0d3ce0-49e4-4e73-b2e9-ce405a023987\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.926915 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.934831 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.948283 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.950377 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.957667 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0d3ce0-49e4-4e73-b2e9-ce405a023987-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7b0d3ce0-49e4-4e73-b2e9-ce405a023987\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.957768 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0d3ce0-49e4-4e73-b2e9-ce405a023987-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7b0d3ce0-49e4-4e73-b2e9-ce405a023987\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.957799 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvr5m\" (UniqueName: \"kubernetes.io/projected/7b0d3ce0-49e4-4e73-b2e9-ce405a023987-kube-api-access-hvr5m\") pod \"nova-cell1-conductor-0\" (UID: \"7b0d3ce0-49e4-4e73-b2e9-ce405a023987\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.960792 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.962072 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0d3ce0-49e4-4e73-b2e9-ce405a023987-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7b0d3ce0-49e4-4e73-b2e9-ce405a023987\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.964486 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.968283 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0d3ce0-49e4-4e73-b2e9-ce405a023987-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7b0d3ce0-49e4-4e73-b2e9-ce405a023987\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.973453 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:25 crc kubenswrapper[4846]: I1122 09:34:25.985354 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvr5m\" (UniqueName: \"kubernetes.io/projected/7b0d3ce0-49e4-4e73-b2e9-ce405a023987-kube-api-access-hvr5m\") pod \"nova-cell1-conductor-0\" (UID: \"7b0d3ce0-49e4-4e73-b2e9-ce405a023987\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.014105 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.059606 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.060147 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.060312 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-logs\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.060369 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-config-data\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.060793 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rvb\" (UniqueName: \"kubernetes.io/projected/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-kube-api-access-45rvb\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.064409 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77792561-32ce-4a62-8f39-2d273ccde671" path="/var/lib/kubelet/pods/77792561-32ce-4a62-8f39-2d273ccde671/volumes" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.163299 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45rvb\" (UniqueName: \"kubernetes.io/projected/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-kube-api-access-45rvb\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.163389 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.163434 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.163497 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-logs\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.163582 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-config-data\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.166206 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-logs\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.171818 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.172033 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-config-data\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.172702 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.183854 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rvb\" (UniqueName: \"kubernetes.io/projected/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-kube-api-access-45rvb\") pod \"nova-metadata-0\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.455714 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.483934 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.613320 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7b0d3ce0-49e4-4e73-b2e9-ce405a023987","Type":"ContainerStarted","Data":"5a12d88d939124a82337708df5db5dcc8679000f11357d6e00ffb3313dff7ec7"} Nov 22 09:34:26 crc kubenswrapper[4846]: I1122 09:34:26.946894 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:34:26 crc kubenswrapper[4846]: E1122 09:34:26.964363 4846 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ad6d3e06eab1993791cc1f9b8769c6bf38190ba63c90738c2d8b795d24b993a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 09:34:26 crc kubenswrapper[4846]: E1122 09:34:26.968483 4846 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ad6d3e06eab1993791cc1f9b8769c6bf38190ba63c90738c2d8b795d24b993a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 09:34:26 crc kubenswrapper[4846]: E1122 09:34:26.970323 4846 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ad6d3e06eab1993791cc1f9b8769c6bf38190ba63c90738c2d8b795d24b993a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 09:34:26 crc kubenswrapper[4846]: E1122 09:34:26.970528 4846 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="70c528eb-ae77-41b9-8a75-b724051d88bb" containerName="nova-scheduler-scheduler" Nov 22 09:34:27 crc kubenswrapper[4846]: I1122 09:34:27.626896 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061","Type":"ContainerStarted","Data":"500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3"} Nov 22 09:34:27 crc kubenswrapper[4846]: I1122 09:34:27.627550 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061","Type":"ContainerStarted","Data":"ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a"} Nov 22 09:34:27 crc kubenswrapper[4846]: I1122 09:34:27.627565 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061","Type":"ContainerStarted","Data":"8e6ada2fc0c5a6a9ead672f7bd019883930853b540ffc5b6c98ac4b437bf2e6f"} Nov 22 09:34:27 crc kubenswrapper[4846]: I1122 09:34:27.642556 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7b0d3ce0-49e4-4e73-b2e9-ce405a023987","Type":"ContainerStarted","Data":"124c370f7416c4ee617020291a84fa49a23e5af2e861de291e3e45fd0e0f9130"} Nov 22 09:34:27 crc kubenswrapper[4846]: I1122 09:34:27.642849 4846 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:27 crc kubenswrapper[4846]: I1122 09:34:27.659992 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.659964268 podStartE2EDuration="2.659964268s" podCreationTimestamp="2025-11-22 09:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:27.654249672 +0000 UTC m=+1242.589939331" watchObservedRunningTime="2025-11-22 09:34:27.659964268 +0000 UTC m=+1242.595653917" Nov 22 09:34:27 crc kubenswrapper[4846]: I1122 09:34:27.672545 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.672524703 podStartE2EDuration="2.672524703s" podCreationTimestamp="2025-11-22 09:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:27.6720709 +0000 UTC m=+1242.607760549" watchObservedRunningTime="2025-11-22 09:34:27.672524703 +0000 UTC m=+1242.608214352" Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.652644 4846 generic.go:334] "Generic (PLEG): container finished" podID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerID="715dd1f0953d1afce0f977703f9fc9def739067ea07ede9777ba184f315bbf44" exitCode=0 Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.653621 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e9b69b9-98ae-4e5c-bc88-b960a615ee03","Type":"ContainerDied","Data":"715dd1f0953d1afce0f977703f9fc9def739067ea07ede9777ba184f315bbf44"} Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.777233 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.946483 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-logs\") pod \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.946819 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbpm8\" (UniqueName: \"kubernetes.io/projected/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-kube-api-access-vbpm8\") pod \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.946980 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-combined-ca-bundle\") pod \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.947156 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-config-data\") pod \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.947604 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-logs" (OuterVolumeSpecName: "logs") pod "0e9b69b9-98ae-4e5c-bc88-b960a615ee03" (UID: "0e9b69b9-98ae-4e5c-bc88-b960a615ee03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.959345 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-kube-api-access-vbpm8" (OuterVolumeSpecName: "kube-api-access-vbpm8") pod "0e9b69b9-98ae-4e5c-bc88-b960a615ee03" (UID: "0e9b69b9-98ae-4e5c-bc88-b960a615ee03"). InnerVolumeSpecName "kube-api-access-vbpm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:34:28 crc kubenswrapper[4846]: E1122 09:34:28.989163 4846 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-combined-ca-bundle podName:0e9b69b9-98ae-4e5c-bc88-b960a615ee03 nodeName:}" failed. No retries permitted until 2025-11-22 09:34:29.489131744 +0000 UTC m=+1244.424821403 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-combined-ca-bundle") pod "0e9b69b9-98ae-4e5c-bc88-b960a615ee03" (UID: "0e9b69b9-98ae-4e5c-bc88-b960a615ee03") : error deleting /var/lib/kubelet/pods/0e9b69b9-98ae-4e5c-bc88-b960a615ee03/volume-subpaths: remove /var/lib/kubelet/pods/0e9b69b9-98ae-4e5c-bc88-b960a615ee03/volume-subpaths: no such file or directory Nov 22 09:34:28 crc kubenswrapper[4846]: I1122 09:34:28.995390 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-config-data" (OuterVolumeSpecName: "config-data") pod "0e9b69b9-98ae-4e5c-bc88-b960a615ee03" (UID: "0e9b69b9-98ae-4e5c-bc88-b960a615ee03"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.050399 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.050444 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbpm8\" (UniqueName: \"kubernetes.io/projected/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-kube-api-access-vbpm8\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.050459 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.501297 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-combined-ca-bundle\") pod \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\" (UID: \"0e9b69b9-98ae-4e5c-bc88-b960a615ee03\") " Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.512648 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e9b69b9-98ae-4e5c-bc88-b960a615ee03" (UID: "0e9b69b9-98ae-4e5c-bc88-b960a615ee03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.604937 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9b69b9-98ae-4e5c-bc88-b960a615ee03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.671296 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e9b69b9-98ae-4e5c-bc88-b960a615ee03","Type":"ContainerDied","Data":"fc87464ea66c04c8540c22746cd653d7aca1ede0679e022fda621348835c7957"} Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.671310 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.671415 4846 scope.go:117] "RemoveContainer" containerID="715dd1f0953d1afce0f977703f9fc9def739067ea07ede9777ba184f315bbf44" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.673648 4846 generic.go:334] "Generic (PLEG): container finished" podID="70c528eb-ae77-41b9-8a75-b724051d88bb" containerID="4ad6d3e06eab1993791cc1f9b8769c6bf38190ba63c90738c2d8b795d24b993a" exitCode=0 Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.673691 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70c528eb-ae77-41b9-8a75-b724051d88bb","Type":"ContainerDied","Data":"4ad6d3e06eab1993791cc1f9b8769c6bf38190ba63c90738c2d8b795d24b993a"} Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.695627 4846 scope.go:117] "RemoveContainer" containerID="6fc1e247eaf5d86a1288a1c211b291a229e4caf17809f8ceee1ccf91eeaaaa79" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.740571 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.750106 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.761422 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 09:34:29 crc kubenswrapper[4846]: E1122 09:34:29.762025 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-api" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.762059 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-api" Nov 22 09:34:29 crc kubenswrapper[4846]: E1122 09:34:29.762076 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-log" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.762083 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-log" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.762300 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-log" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.762328 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" containerName="nova-api-api" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.763554 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.769756 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.770581 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.811649 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-config-data\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.811842 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-885hx\" (UniqueName: \"kubernetes.io/projected/a9829e4b-0ffa-4156-822e-51682b8e3634-kube-api-access-885hx\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.811920 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9829e4b-0ffa-4156-822e-51682b8e3634-logs\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.811953 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.914560 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9829e4b-0ffa-4156-822e-51682b8e3634-logs\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.915154 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.915360 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9829e4b-0ffa-4156-822e-51682b8e3634-logs\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.915545 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-config-data\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.915882 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-885hx\" (UniqueName: \"kubernetes.io/projected/a9829e4b-0ffa-4156-822e-51682b8e3634-kube-api-access-885hx\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " 
pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.922303 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.922415 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-config-data\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:29 crc kubenswrapper[4846]: I1122 09:34:29.941708 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-885hx\" (UniqueName: \"kubernetes.io/projected/a9829e4b-0ffa-4156-822e-51682b8e3634-kube-api-access-885hx\") pod \"nova-api-0\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") " pod="openstack/nova-api-0" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.047100 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9b69b9-98ae-4e5c-bc88-b960a615ee03" path="/var/lib/kubelet/pods/0e9b69b9-98ae-4e5c-bc88-b960a615ee03/volumes" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.094433 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.571431 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.634809 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-config-data\") pod \"70c528eb-ae77-41b9-8a75-b724051d88bb\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.635146 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4qfd\" (UniqueName: \"kubernetes.io/projected/70c528eb-ae77-41b9-8a75-b724051d88bb-kube-api-access-h4qfd\") pod \"70c528eb-ae77-41b9-8a75-b724051d88bb\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.635173 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-combined-ca-bundle\") pod \"70c528eb-ae77-41b9-8a75-b724051d88bb\" (UID: \"70c528eb-ae77-41b9-8a75-b724051d88bb\") " Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.644444 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c528eb-ae77-41b9-8a75-b724051d88bb-kube-api-access-h4qfd" (OuterVolumeSpecName: "kube-api-access-h4qfd") pod "70c528eb-ae77-41b9-8a75-b724051d88bb" (UID: "70c528eb-ae77-41b9-8a75-b724051d88bb"). InnerVolumeSpecName "kube-api-access-h4qfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.679343 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.708115 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70c528eb-ae77-41b9-8a75-b724051d88bb" (UID: "70c528eb-ae77-41b9-8a75-b724051d88bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.726739 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70c528eb-ae77-41b9-8a75-b724051d88bb","Type":"ContainerDied","Data":"f89fd2ebfbd87d0bb5a06974723c7bd99633c9cf21e39dace556c416ef7f99fa"} Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.726802 4846 scope.go:117] "RemoveContainer" containerID="4ad6d3e06eab1993791cc1f9b8769c6bf38190ba63c90738c2d8b795d24b993a" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.727007 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.736343 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-config-data" (OuterVolumeSpecName: "config-data") pod "70c528eb-ae77-41b9-8a75-b724051d88bb" (UID: "70c528eb-ae77-41b9-8a75-b724051d88bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.738174 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.738209 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4qfd\" (UniqueName: \"kubernetes.io/projected/70c528eb-ae77-41b9-8a75-b724051d88bb-kube-api-access-h4qfd\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:30 crc kubenswrapper[4846]: I1122 09:34:30.738225 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c528eb-ae77-41b9-8a75-b724051d88bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.052748 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.067377 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.077447 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.094300 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:34:31 crc kubenswrapper[4846]: E1122 09:34:31.094839 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c528eb-ae77-41b9-8a75-b724051d88bb" containerName="nova-scheduler-scheduler" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.094867 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c528eb-ae77-41b9-8a75-b724051d88bb" 
containerName="nova-scheduler-scheduler" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.095095 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c528eb-ae77-41b9-8a75-b724051d88bb" containerName="nova-scheduler-scheduler" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.095957 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.103398 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.129479 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.147183 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqlvr\" (UniqueName: \"kubernetes.io/projected/485d8018-77aa-40ce-8978-8126c84202ac-kube-api-access-nqlvr\") pod \"nova-scheduler-0\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.147261 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-config-data\") pod \"nova-scheduler-0\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.147282 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.270769 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqlvr\" (UniqueName: \"kubernetes.io/projected/485d8018-77aa-40ce-8978-8126c84202ac-kube-api-access-nqlvr\") pod \"nova-scheduler-0\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.271775 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-config-data\") pod \"nova-scheduler-0\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.272650 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.275691 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-config-data\") pod \"nova-scheduler-0\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.275898 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.292783 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqlvr\" (UniqueName: \"kubernetes.io/projected/485d8018-77aa-40ce-8978-8126c84202ac-kube-api-access-nqlvr\") pod \"nova-scheduler-0\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.418573 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.456381 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.457249 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.775527 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9829e4b-0ffa-4156-822e-51682b8e3634","Type":"ContainerStarted","Data":"82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66"} Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.776011 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9829e4b-0ffa-4156-822e-51682b8e3634","Type":"ContainerStarted","Data":"e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189"} Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.776029 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9829e4b-0ffa-4156-822e-51682b8e3634","Type":"ContainerStarted","Data":"b2b6325dec7c2e30a2bc6f021c32d3ee6f813c098425b2c9db9d95d9a071d4c4"} Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.801599 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.801578599 podStartE2EDuration="2.801578599s" podCreationTimestamp="2025-11-22 09:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:31.79402233 +0000 UTC m=+1246.729711979" watchObservedRunningTime="2025-11-22 09:34:31.801578599 +0000 UTC m=+1246.737268248" Nov 22 09:34:31 crc kubenswrapper[4846]: I1122 09:34:31.945326 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:34:32 crc kubenswrapper[4846]: I1122 09:34:32.049977 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c528eb-ae77-41b9-8a75-b724051d88bb" path="/var/lib/kubelet/pods/70c528eb-ae77-41b9-8a75-b724051d88bb/volumes" Nov 22 09:34:32 crc kubenswrapper[4846]: I1122 09:34:32.790559 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"485d8018-77aa-40ce-8978-8126c84202ac","Type":"ContainerStarted","Data":"6fc634df0b706be58438afbf2725cee08688d7a2b6044afb9c45fee5dac97656"} Nov 22 09:34:32 crc kubenswrapper[4846]: I1122 09:34:32.790607 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"485d8018-77aa-40ce-8978-8126c84202ac","Type":"ContainerStarted","Data":"ec4f714d6a8083e950bc7e61a4e49148b4013beb78e691c737f3564e48480221"} Nov 22 09:34:32 crc kubenswrapper[4846]: 
I1122 09:34:32.809188 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.809168397 podStartE2EDuration="1.809168397s" podCreationTimestamp="2025-11-22 09:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:32.805920803 +0000 UTC m=+1247.741610462" watchObservedRunningTime="2025-11-22 09:34:32.809168397 +0000 UTC m=+1247.744858046" Nov 22 09:34:33 crc kubenswrapper[4846]: I1122 09:34:33.429340 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 09:34:36 crc kubenswrapper[4846]: I1122 09:34:36.419712 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 09:34:36 crc kubenswrapper[4846]: I1122 09:34:36.456340 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:34:36 crc kubenswrapper[4846]: I1122 09:34:36.456411 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:34:37 crc kubenswrapper[4846]: I1122 09:34:37.465298 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 09:34:37 crc kubenswrapper[4846]: I1122 09:34:37.475334 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 09:34:40 crc kubenswrapper[4846]: I1122 09:34:40.095113 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:34:40 crc kubenswrapper[4846]: I1122 09:34:40.095234 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:34:41 crc kubenswrapper[4846]: I1122 09:34:41.178269 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:34:41 crc kubenswrapper[4846]: I1122 09:34:41.178275 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:34:41 crc kubenswrapper[4846]: I1122 09:34:41.419504 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 09:34:41 crc kubenswrapper[4846]: I1122 09:34:41.474889 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 09:34:42 crc kubenswrapper[4846]: I1122 09:34:42.227854 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 09:34:46 crc kubenswrapper[4846]: I1122 
09:34:46.468241 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 09:34:46 crc kubenswrapper[4846]: I1122 09:34:46.471362 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 09:34:46 crc kubenswrapper[4846]: I1122 09:34:46.484341 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 09:34:47 crc kubenswrapper[4846]: I1122 09:34:47.255787 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 09:34:48 crc kubenswrapper[4846]: I1122 09:34:48.903551 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.034729 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5wlj\" (UniqueName: \"kubernetes.io/projected/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-kube-api-access-r5wlj\") pod \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.034976 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-combined-ca-bundle\") pod \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.035069 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-config-data\") pod \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\" (UID: \"f7df46c0-c0b8-401d-95ba-5b42afc7b06c\") " Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.045026 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-kube-api-access-r5wlj" (OuterVolumeSpecName: "kube-api-access-r5wlj") pod "f7df46c0-c0b8-401d-95ba-5b42afc7b06c" (UID: "f7df46c0-c0b8-401d-95ba-5b42afc7b06c"). InnerVolumeSpecName "kube-api-access-r5wlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.081271 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-config-data" (OuterVolumeSpecName: "config-data") pod "f7df46c0-c0b8-401d-95ba-5b42afc7b06c" (UID: "f7df46c0-c0b8-401d-95ba-5b42afc7b06c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.087048 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7df46c0-c0b8-401d-95ba-5b42afc7b06c" (UID: "f7df46c0-c0b8-401d-95ba-5b42afc7b06c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.138745 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.138945 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.140219 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5wlj\" (UniqueName: \"kubernetes.io/projected/f7df46c0-c0b8-401d-95ba-5b42afc7b06c-kube-api-access-r5wlj\") on node \"crc\" DevicePath \"\"" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.273645 4846 generic.go:334] "Generic (PLEG): container finished" podID="f7df46c0-c0b8-401d-95ba-5b42afc7b06c" containerID="5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82" exitCode=137 Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.273856 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.273896 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7df46c0-c0b8-401d-95ba-5b42afc7b06c","Type":"ContainerDied","Data":"5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82"} Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.274399 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7df46c0-c0b8-401d-95ba-5b42afc7b06c","Type":"ContainerDied","Data":"b17051b2f886cbc058dbb1151cb4f61f0aa0d6995133d538d765486a1ac74222"} Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.274430 4846 scope.go:117] "RemoveContainer" containerID="5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.309503 4846 scope.go:117] "RemoveContainer" containerID="5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82" Nov 22 09:34:49 crc kubenswrapper[4846]: E1122 09:34:49.310404 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82\": container with ID starting with 5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82 not found: ID does not exist" containerID="5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.310495 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82"} err="failed to get container status \"5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82\": rpc error: code = NotFound desc = could not find container \"5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82\": container with ID starting with 5e25b902845cf3b8f0295c01d49eab5e1efdacc40ba1f7e1808964fef4e56e82 not found: ID does not exist" Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.347197 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 
09:34:49.358929 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.373706 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 09:34:49 crc kubenswrapper[4846]: E1122 09:34:49.374388 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7df46c0-c0b8-401d-95ba-5b42afc7b06c" containerName="nova-cell1-novncproxy-novncproxy"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.374410 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7df46c0-c0b8-401d-95ba-5b42afc7b06c" containerName="nova-cell1-novncproxy-novncproxy"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.374737 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7df46c0-c0b8-401d-95ba-5b42afc7b06c" containerName="nova-cell1-novncproxy-novncproxy"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.376189 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.379960 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.380807 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.381613 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.388900 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.548975 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.549144 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.549228 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.549394 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rgtr\" (UniqueName: \"kubernetes.io/projected/faa4297f-4d7b-4942-958c-ccc0f3891f2a-kube-api-access-7rgtr\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.549640 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.651881 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.652014 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.652114 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.652218 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rgtr\" (UniqueName: \"kubernetes.io/projected/faa4297f-4d7b-4942-958c-ccc0f3891f2a-kube-api-access-7rgtr\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.652457 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.661390 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.661602 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.662827 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.663930 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa4297f-4d7b-4942-958c-ccc0f3891f2a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.675104 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rgtr\" (UniqueName: \"kubernetes.io/projected/faa4297f-4d7b-4942-958c-ccc0f3891f2a-kube-api-access-7rgtr\") pod \"nova-cell1-novncproxy-0\" (UID: \"faa4297f-4d7b-4942-958c-ccc0f3891f2a\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:49 crc kubenswrapper[4846]: I1122 09:34:49.705446 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.021962 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 09:34:50 crc kubenswrapper[4846]: W1122 09:34:50.024949 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaa4297f_4d7b_4942_958c_ccc0f3891f2a.slice/crio-2f07fa274ded97cce07c418012dbb02c3e722b1b3e53457613f45cc7d3c3ebe5 WatchSource:0}: Error finding container 2f07fa274ded97cce07c418012dbb02c3e722b1b3e53457613f45cc7d3c3ebe5: Status 404 returned error can't find the container with id 2f07fa274ded97cce07c418012dbb02c3e722b1b3e53457613f45cc7d3c3ebe5
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.052943 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7df46c0-c0b8-401d-95ba-5b42afc7b06c" path="/var/lib/kubelet/pods/f7df46c0-c0b8-401d-95ba-5b42afc7b06c/volumes"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.101364 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.102081 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.104780 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.108339 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.287696 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"faa4297f-4d7b-4942-958c-ccc0f3891f2a","Type":"ContainerStarted","Data":"2f07fa274ded97cce07c418012dbb02c3e722b1b3e53457613f45cc7d3c3ebe5"}
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.292552 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.299694 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.537646 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-wbtbs"]
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.539466 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.616097 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-wbtbs"]
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.679185 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-config\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.679303 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.679380 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpnnc\" (UniqueName: \"kubernetes.io/projected/defce30c-d2ab-4153-91de-c76acd4c3529-kube-api-access-hpnnc\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.679424 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.679470 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.679492 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.781937 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpnnc\" (UniqueName: \"kubernetes.io/projected/defce30c-d2ab-4153-91de-c76acd4c3529-kube-api-access-hpnnc\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.782091 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.783175 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.783198 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.783248 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.783291 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.784037 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.784931 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-config\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.784982 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-config\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.785817 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.785864 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.816222 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpnnc\" (UniqueName: \"kubernetes.io/projected/defce30c-d2ab-4153-91de-c76acd4c3529-kube-api-access-hpnnc\") pod \"dnsmasq-dns-89c5cd4d5-wbtbs\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:50 crc kubenswrapper[4846]: I1122 09:34:50.862752 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:51 crc kubenswrapper[4846]: I1122 09:34:51.189640 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-wbtbs"]
Nov 22 09:34:51 crc kubenswrapper[4846]: I1122 09:34:51.306183 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"faa4297f-4d7b-4942-958c-ccc0f3891f2a","Type":"ContainerStarted","Data":"7295f4f9461880a68326c2e468cd51eb742ffeaa0d9c23b8600ae8e5df4e466f"}
Nov 22 09:34:51 crc kubenswrapper[4846]: I1122 09:34:51.309815 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" event={"ID":"defce30c-d2ab-4153-91de-c76acd4c3529","Type":"ContainerStarted","Data":"971caf9fdc7eded19f8ae2abcae13ef9620b78f14c9b5014fcf18dbed09df5d8"}
Nov 22 09:34:51 crc kubenswrapper[4846]: I1122 09:34:51.330276 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.330250659 podStartE2EDuration="2.330250659s" podCreationTimestamp="2025-11-22 09:34:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:51.323623857 +0000 UTC m=+1266.259313506" watchObservedRunningTime="2025-11-22 09:34:51.330250659 +0000 UTC m=+1266.265940328"
Nov 22 09:34:52 crc kubenswrapper[4846]: I1122 09:34:52.326500 4846 generic.go:334] "Generic (PLEG): container finished" podID="defce30c-d2ab-4153-91de-c76acd4c3529" containerID="4b5f0b985023954e84079c6d18a6d0b439d3df08561d980739456063aeaf0b62" exitCode=0
Nov 22 09:34:52 crc kubenswrapper[4846]: I1122 09:34:52.329240 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" event={"ID":"defce30c-d2ab-4153-91de-c76acd4c3529","Type":"ContainerDied","Data":"4b5f0b985023954e84079c6d18a6d0b439d3df08561d980739456063aeaf0b62"}
Nov 22 09:34:52 crc kubenswrapper[4846]: I1122 09:34:52.927499 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:34:52 crc kubenswrapper[4846]: I1122 09:34:52.929709 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="ceilometer-central-agent" containerID="cri-o://ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9" gracePeriod=30
Nov 22 09:34:52 crc kubenswrapper[4846]: I1122 09:34:52.929798 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="ceilometer-notification-agent" containerID="cri-o://24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675" gracePeriod=30
Nov 22 09:34:52 crc kubenswrapper[4846]: I1122 09:34:52.929807 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="proxy-httpd" containerID="cri-o://9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487" gracePeriod=30
Nov 22 09:34:52 crc kubenswrapper[4846]: I1122 09:34:52.929808 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="sg-core" containerID="cri-o://d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362" gracePeriod=30
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.204891 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.343707 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" event={"ID":"defce30c-d2ab-4153-91de-c76acd4c3529","Type":"ContainerStarted","Data":"f249fe84a9c2f30e2d5ecec9b5b3cbeb411fba19b8045fe57eec7f3aaad3dcbd"}
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.344691 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs"
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.348936 4846 generic.go:334] "Generic (PLEG): container finished" podID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerID="9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487" exitCode=0
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.349087 4846 generic.go:334] "Generic (PLEG): container finished" podID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerID="d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362" exitCode=2
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.349237 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerDied","Data":"9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487"}
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.349304 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerDied","Data":"d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362"}
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.349690 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-log" containerID="cri-o://e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189" gracePeriod=30
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.349749 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-api" containerID="cri-o://82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66" gracePeriod=30
Nov 22 09:34:53 crc kubenswrapper[4846]: I1122 09:34:53.379427 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" podStartSLOduration=3.37940102 podStartE2EDuration="3.37940102s" podCreationTimestamp="2025-11-22 09:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:53.365932079 +0000 UTC m=+1268.301621728" watchObservedRunningTime="2025-11-22 09:34:53.37940102 +0000 UTC m=+1268.315090669"
Nov 22 09:34:54 crc kubenswrapper[4846]: I1122 09:34:54.366188 4846 generic.go:334] "Generic (PLEG): container finished" podID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerID="ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9" exitCode=0
Nov 22 09:34:54 crc kubenswrapper[4846]: I1122 09:34:54.366688 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerDied","Data":"ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9"}
Nov 22 09:34:54 crc kubenswrapper[4846]: I1122 09:34:54.369919 4846 generic.go:334] "Generic (PLEG): container finished" podID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerID="e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189" exitCode=143
Nov 22 09:34:54 crc kubenswrapper[4846]: I1122 09:34:54.369979 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9829e4b-0ffa-4156-822e-51682b8e3634","Type":"ContainerDied","Data":"e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189"}
Nov 22 09:34:54 crc kubenswrapper[4846]: I1122 09:34:54.706619 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.298950 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.411735 4846 generic.go:334] "Generic (PLEG): container finished" podID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerID="24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675" exitCode=0
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.411788 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerDied","Data":"24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675"}
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.411832 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3419e190-c6c9-409c-8c77-0ab4c20dee33","Type":"ContainerDied","Data":"5c51789b85f5041792af2c22f760e7f8244962f91776fbc98b70728aabb89efd"}
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.411840 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.411853 4846 scope.go:117] "RemoveContainer" containerID="9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.422228 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-combined-ca-bundle\") pod \"3419e190-c6c9-409c-8c77-0ab4c20dee33\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.422285 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-config-data\") pod \"3419e190-c6c9-409c-8c77-0ab4c20dee33\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.422325 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-log-httpd\") pod \"3419e190-c6c9-409c-8c77-0ab4c20dee33\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.422350 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-run-httpd\") pod \"3419e190-c6c9-409c-8c77-0ab4c20dee33\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.422399 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnz6h\" (UniqueName: \"kubernetes.io/projected/3419e190-c6c9-409c-8c77-0ab4c20dee33-kube-api-access-mnz6h\") pod \"3419e190-c6c9-409c-8c77-0ab4c20dee33\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.422561 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-scripts\") pod \"3419e190-c6c9-409c-8c77-0ab4c20dee33\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.422590 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-ceilometer-tls-certs\") pod \"3419e190-c6c9-409c-8c77-0ab4c20dee33\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.422613 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-sg-core-conf-yaml\") pod \"3419e190-c6c9-409c-8c77-0ab4c20dee33\" (UID: \"3419e190-c6c9-409c-8c77-0ab4c20dee33\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.424434 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3419e190-c6c9-409c-8c77-0ab4c20dee33" (UID: "3419e190-c6c9-409c-8c77-0ab4c20dee33"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.426338 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3419e190-c6c9-409c-8c77-0ab4c20dee33" (UID: "3419e190-c6c9-409c-8c77-0ab4c20dee33"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.433909 4846 scope.go:117] "RemoveContainer" containerID="d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.439333 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-scripts" (OuterVolumeSpecName: "scripts") pod "3419e190-c6c9-409c-8c77-0ab4c20dee33" (UID: "3419e190-c6c9-409c-8c77-0ab4c20dee33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.447328 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3419e190-c6c9-409c-8c77-0ab4c20dee33-kube-api-access-mnz6h" (OuterVolumeSpecName: "kube-api-access-mnz6h") pod "3419e190-c6c9-409c-8c77-0ab4c20dee33" (UID: "3419e190-c6c9-409c-8c77-0ab4c20dee33"). InnerVolumeSpecName "kube-api-access-mnz6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.464678 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3419e190-c6c9-409c-8c77-0ab4c20dee33" (UID: "3419e190-c6c9-409c-8c77-0ab4c20dee33"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.488551 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3419e190-c6c9-409c-8c77-0ab4c20dee33" (UID: "3419e190-c6c9-409c-8c77-0ab4c20dee33"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.526098 4846 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.526171 4846 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3419e190-c6c9-409c-8c77-0ab4c20dee33-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.526186 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnz6h\" (UniqueName: \"kubernetes.io/projected/3419e190-c6c9-409c-8c77-0ab4c20dee33-kube-api-access-mnz6h\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.526205 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.526218 4846 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.526229 4846 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.544331 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3419e190-c6c9-409c-8c77-0ab4c20dee33" (UID: "3419e190-c6c9-409c-8c77-0ab4c20dee33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.544912 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-config-data" (OuterVolumeSpecName: "config-data") pod "3419e190-c6c9-409c-8c77-0ab4c20dee33" (UID: "3419e190-c6c9-409c-8c77-0ab4c20dee33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.609758 4846 scope.go:117] "RemoveContainer" containerID="24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.628542 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.628582 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3419e190-c6c9-409c-8c77-0ab4c20dee33-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.644659 4846 scope.go:117] "RemoveContainer" containerID="ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.666902 4846 scope.go:117] "RemoveContainer" containerID="9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487"
Nov 22 09:34:56 crc kubenswrapper[4846]: E1122 09:34:56.671739 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487\": container with ID starting with 9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487 not found: ID does not exist" containerID="9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.671800 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487"} err="failed to get container status \"9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487\": rpc error: code = NotFound desc = could not find container \"9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487\": container with ID starting with 9251c611b0577136523e9d70da75c4531993e737646ac26a98cff14537a84487 not found: ID does not exist"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.671837 4846 scope.go:117] "RemoveContainer" containerID="d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362"
Nov 22 09:34:56 crc kubenswrapper[4846]: E1122 09:34:56.682694 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362\": container with ID starting with d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362 not found: ID does not exist" containerID="d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.682744 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362"} err="failed to get container status \"d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362\": rpc error: code = NotFound desc = could not find container \"d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362\": container with ID starting with d9de7d607e2e750aaa2b3f5e01a6c6a5a918199d5669ab3236234d53c4141362 not found: ID does not exist"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.682771 4846 scope.go:117] "RemoveContainer" containerID="24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675"
Nov 22 09:34:56 crc kubenswrapper[4846]: E1122 09:34:56.685170 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675\": container with ID starting with 24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675 not found: ID does not exist" containerID="24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.685203 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675"} err="failed to get container status \"24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675\": rpc error: code = NotFound desc = could not find container \"24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675\": container with ID starting with 24d9a265db2bfc18f663f7623cd040b1467265148e0e67e1461a5a82f971d675 not found: ID does not exist"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.685218 4846 scope.go:117] "RemoveContainer" containerID="ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9"
Nov 22 09:34:56 crc kubenswrapper[4846]: E1122 09:34:56.685871 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9\": container with ID starting with ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9 not found: ID does not exist" containerID="ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.685945 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9"} err="failed to get container status \"ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9\": rpc error: code = NotFound desc = could not find container \"ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9\": container with ID starting with ff16664ff33c58cadd1401efd5ae09e45793509dee4ffe75d0d4c1402fa689c9 not found: ID does not exist"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.767959 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.783311 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.801829 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:34:56 crc kubenswrapper[4846]: E1122 09:34:56.802383 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="ceilometer-central-agent"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.802404 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="ceilometer-central-agent"
Nov 22 09:34:56 crc kubenswrapper[4846]: E1122 09:34:56.802433 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="sg-core"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.802439 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="sg-core"
Nov 22 09:34:56 crc kubenswrapper[4846]: E1122 09:34:56.802459 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="ceilometer-notification-agent"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.802466 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="ceilometer-notification-agent"
Nov 22 09:34:56 crc kubenswrapper[4846]: E1122 09:34:56.802481 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="proxy-httpd"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.802488 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="proxy-httpd"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.802728 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="ceilometer-notification-agent"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.802744 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="proxy-httpd"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.802753 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="sg-core"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.802762 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" containerName="ceilometer-central-agent"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.811179 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.814557 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.814609 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.814861 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.817965 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.869608 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.934206 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-config-data\") pod \"a9829e4b-0ffa-4156-822e-51682b8e3634\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.934398 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-combined-ca-bundle\") pod \"a9829e4b-0ffa-4156-822e-51682b8e3634\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.934461 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-885hx\" (UniqueName: \"kubernetes.io/projected/a9829e4b-0ffa-4156-822e-51682b8e3634-kube-api-access-885hx\") pod \"a9829e4b-0ffa-4156-822e-51682b8e3634\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.934699 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9829e4b-0ffa-4156-822e-51682b8e3634-logs\") pod \"a9829e4b-0ffa-4156-822e-51682b8e3634\" (UID: \"a9829e4b-0ffa-4156-822e-51682b8e3634\") "
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.935218 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fvf\" (UniqueName: \"kubernetes.io/projected/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-kube-api-access-v5fvf\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.935279 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-scripts\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.935359 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.935396 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-run-httpd\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.935429 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-log-httpd\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.935460 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-config-data\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.935485 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.935530 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.937542 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9829e4b-0ffa-4156-822e-51682b8e3634-logs" (OuterVolumeSpecName: "logs") pod "a9829e4b-0ffa-4156-822e-51682b8e3634" (UID: "a9829e4b-0ffa-4156-822e-51682b8e3634"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.941983 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9829e4b-0ffa-4156-822e-51682b8e3634-kube-api-access-885hx" (OuterVolumeSpecName: "kube-api-access-885hx") pod "a9829e4b-0ffa-4156-822e-51682b8e3634" (UID: "a9829e4b-0ffa-4156-822e-51682b8e3634"). InnerVolumeSpecName "kube-api-access-885hx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.968092 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-config-data" (OuterVolumeSpecName: "config-data") pod "a9829e4b-0ffa-4156-822e-51682b8e3634" (UID: "a9829e4b-0ffa-4156-822e-51682b8e3634"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:34:56 crc kubenswrapper[4846]: I1122 09:34:56.983238 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9829e4b-0ffa-4156-822e-51682b8e3634" (UID: "a9829e4b-0ffa-4156-822e-51682b8e3634"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038416 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fvf\" (UniqueName: \"kubernetes.io/projected/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-kube-api-access-v5fvf\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038478 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-scripts\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038542 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038570 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-run-httpd\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038602 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-log-httpd\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038624 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-config-data\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038645 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038683 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038806 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038821 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829e4b-0ffa-4156-822e-51682b8e3634-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038837 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-885hx\" (UniqueName: \"kubernetes.io/projected/a9829e4b-0ffa-4156-822e-51682b8e3634-kube-api-access-885hx\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.038849 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9829e4b-0ffa-4156-822e-51682b8e3634-logs\") on node \"crc\" DevicePath \"\""
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.041039 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-log-httpd\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.041737 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-run-httpd\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.044101 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-scripts\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.045459 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-config-data\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.046091 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.046714 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.047308 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.065556 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fvf\" (UniqueName: \"kubernetes.io/projected/b8240da2-e07e-4b79-81b7-4dffdf4b4c91-kube-api-access-v5fvf\") pod \"ceilometer-0\" (UID: \"b8240da2-e07e-4b79-81b7-4dffdf4b4c91\") " pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.149475 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.434218 4846 generic.go:334] "Generic (PLEG): container finished" podID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerID="82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66" exitCode=0
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.434284 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.434303 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9829e4b-0ffa-4156-822e-51682b8e3634","Type":"ContainerDied","Data":"82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66"}
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.434366 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9829e4b-0ffa-4156-822e-51682b8e3634","Type":"ContainerDied","Data":"b2b6325dec7c2e30a2bc6f021c32d3ee6f813c098425b2c9db9d95d9a071d4c4"}
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.434395 4846 scope.go:117] "RemoveContainer" containerID="82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.475701 4846 scope.go:117] "RemoveContainer" containerID="e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.476602 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.488545 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.508300 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:34:57 crc kubenswrapper[4846]: E1122 09:34:57.508854 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-api"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.508908 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-api"
Nov 22 09:34:57 crc kubenswrapper[4846]: E1122 09:34:57.508941 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-log"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.508954 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-log"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.509247 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-log"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.509284 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" containerName="nova-api-api"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.510745 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.518381 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.518582 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.519965 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.520027 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.526254 4846 scope.go:117] "RemoveContainer" containerID="82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66"
Nov 22 09:34:57 crc kubenswrapper[4846]: E1122 09:34:57.530279 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66\": container with ID starting with 82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66 not found: ID does not exist" containerID="82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.530322 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66"} err="failed to get container status \"82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66\": rpc error: code = NotFound desc = could not find container \"82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66\": container with ID starting with 82afebf276e736941b3e4a7c210aa375c2c633ca5f6f877cdcb33cfc7b099e66 not found: ID does not exist"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.530351 4846 scope.go:117] "RemoveContainer" containerID="e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189"
Nov 22 09:34:57 crc kubenswrapper[4846]: E1122 09:34:57.538014 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189\": container with ID starting with e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189 not found: ID does not exist" containerID="e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.538176 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189"} err="failed to get container status \"e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189\": rpc error: code = NotFound desc = could not find container \"e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189\": container with ID starting with e441d61df55054554492bb5194df6f9beefa58602f51264c127eb33117581189 not found: ID does not exist"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.658068 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-config-data\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.658157 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzdh\" (UniqueName: \"kubernetes.io/projected/8a3e58bf-db06-4335-8193-55e1b109d0de-kube-api-access-hqzdh\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.658189 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.658413 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.658737 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3e58bf-db06-4335-8193-55e1b109d0de-logs\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.658930 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: W1122 09:34:57.687586 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8240da2_e07e_4b79_81b7_4dffdf4b4c91.slice/crio-2f87dd617cdd3f6658cad498c1ac443df3f3e33f28290f635fd46bcaf4869918 WatchSource:0}: Error finding container 2f87dd617cdd3f6658cad498c1ac443df3f3e33f28290f635fd46bcaf4869918: Status 404 returned error can't find the container with id 2f87dd617cdd3f6658cad498c1ac443df3f3e33f28290f635fd46bcaf4869918
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.689413 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.694307 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.761736 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzdh\" (UniqueName: \"kubernetes.io/projected/8a3e58bf-db06-4335-8193-55e1b109d0de-kube-api-access-hqzdh\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.761794 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.761864 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.761921 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3e58bf-db06-4335-8193-55e1b109d0de-logs\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.761967 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.762013 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-config-data\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.762669 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3e58bf-db06-4335-8193-55e1b109d0de-logs\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.780400 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.780730 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.781197 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-public-tls-certs\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.784474 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-config-data\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.792020 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzdh\" (UniqueName: \"kubernetes.io/projected/8a3e58bf-db06-4335-8193-55e1b109d0de-kube-api-access-hqzdh\") pod \"nova-api-0\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " pod="openstack/nova-api-0"
Nov 22 09:34:57 crc kubenswrapper[4846]: I1122 09:34:57.846335 4846 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:34:58 crc kubenswrapper[4846]: I1122 09:34:58.067910 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3419e190-c6c9-409c-8c77-0ab4c20dee33" path="/var/lib/kubelet/pods/3419e190-c6c9-409c-8c77-0ab4c20dee33/volumes" Nov 22 09:34:58 crc kubenswrapper[4846]: I1122 09:34:58.069907 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9829e4b-0ffa-4156-822e-51682b8e3634" path="/var/lib/kubelet/pods/a9829e4b-0ffa-4156-822e-51682b8e3634/volumes" Nov 22 09:34:58 crc kubenswrapper[4846]: I1122 09:34:58.339448 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:34:58 crc kubenswrapper[4846]: I1122 09:34:58.464913 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a3e58bf-db06-4335-8193-55e1b109d0de","Type":"ContainerStarted","Data":"75b894522c586a0e275921a5b028fd0dce2370a841cdebbe2e34fc56890f328c"} Nov 22 09:34:58 crc kubenswrapper[4846]: I1122 09:34:58.472359 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8240da2-e07e-4b79-81b7-4dffdf4b4c91","Type":"ContainerStarted","Data":"2f87dd617cdd3f6658cad498c1ac443df3f3e33f28290f635fd46bcaf4869918"} Nov 22 09:34:59 crc kubenswrapper[4846]: I1122 09:34:59.486319 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8240da2-e07e-4b79-81b7-4dffdf4b4c91","Type":"ContainerStarted","Data":"0914fbaa1679fc9ff8d4bebe2f8b122cd6ad3840724303543a3a25ef3119387d"} Nov 22 09:34:59 crc kubenswrapper[4846]: I1122 09:34:59.488375 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8240da2-e07e-4b79-81b7-4dffdf4b4c91","Type":"ContainerStarted","Data":"edc98640c18884b5aba213255ba37a3d93397233ed406020294c9a13a71cffdc"} Nov 22 09:34:59 crc kubenswrapper[4846]: I1122 09:34:59.490862 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a3e58bf-db06-4335-8193-55e1b109d0de","Type":"ContainerStarted","Data":"36802b5be6a63763b03548bdc20aa224351a9117d7dd2fb0392b7d5d0b1c2a64"} Nov 22 09:34:59 crc kubenswrapper[4846]: I1122 09:34:59.490980 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a3e58bf-db06-4335-8193-55e1b109d0de","Type":"ContainerStarted","Data":"9364d643445c4a1c9cb1400f2ee8b1d0aeff33c8e3b586cbcb2bd398ced15413"} Nov 22 09:34:59 crc kubenswrapper[4846]: I1122 09:34:59.517705 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.517673299 podStartE2EDuration="2.517673299s" podCreationTimestamp="2025-11-22 09:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:34:59.514724274 +0000 UTC m=+1274.450413933" watchObservedRunningTime="2025-11-22 09:34:59.517673299 +0000 UTC m=+1274.453362958" Nov 22 09:34:59 crc kubenswrapper[4846]: I1122 09:34:59.705704 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:34:59 crc kubenswrapper[4846]: I1122 09:34:59.738784 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.502344 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b8240da2-e07e-4b79-81b7-4dffdf4b4c91","Type":"ContainerStarted","Data":"68f26e7b56a36902d54f7efd1a0f78e1372b478e5b3e136b34416ebd9fd801d3"} Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.526206 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.709966 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-lvfpk"] Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.713747 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.719154 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lvfpk"] Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.719665 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.719919 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.745882 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-config-data\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.746171 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.746237 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64f77\" (UniqueName: \"kubernetes.io/projected/b782bb33-f288-4cea-8e09-f89cf68c5154-kube-api-access-64f77\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.746418 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-scripts\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.847854 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-config-data\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.847972 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " 
pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.848010 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64f77\" (UniqueName: \"kubernetes.io/projected/b782bb33-f288-4cea-8e09-f89cf68c5154-kube-api-access-64f77\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.848091 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-scripts\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.855907 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-config-data\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.856588 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.865027 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-scripts\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.865443 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.871375 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64f77\" (UniqueName: \"kubernetes.io/projected/b782bb33-f288-4cea-8e09-f89cf68c5154-kube-api-access-64f77\") pod \"nova-cell1-cell-mapping-lvfpk\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.932903 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ljjss"] Nov 22 09:35:00 crc kubenswrapper[4846]: I1122 09:35:00.933442 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" podUID="38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" containerName="dnsmasq-dns" containerID="cri-o://60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374" gracePeriod=10 Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.040761 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.429626 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.462661 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-swift-storage-0\") pod \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.462741 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-config\") pod \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.462819 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-nb\") pod \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.463014 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7hcn\" (UniqueName: \"kubernetes.io/projected/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-kube-api-access-r7hcn\") pod \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.463040 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-svc\") pod \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.463139 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-sb\") pod \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\" (UID: \"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c\") " Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.494752 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-kube-api-access-r7hcn" (OuterVolumeSpecName: "kube-api-access-r7hcn") pod "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" (UID: "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c"). InnerVolumeSpecName "kube-api-access-r7hcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.533475 4846 generic.go:334] "Generic (PLEG): container finished" podID="38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" containerID="60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374" exitCode=0 Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.536454 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" event={"ID":"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c","Type":"ContainerDied","Data":"60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374"} Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.539257 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" event={"ID":"38a874d5-f2f2-42ba-89c7-9fc4a2e7272c","Type":"ContainerDied","Data":"3accb8a8d3190d6009e25b5d976fde078453ddf8a0b1ce2565278f24e5c3bf6f"} Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.539292 4846 scope.go:117] "RemoveContainer" containerID="60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.536542 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ljjss" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.570555 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7hcn\" (UniqueName: \"kubernetes.io/projected/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-kube-api-access-r7hcn\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.612237 4846 scope.go:117] "RemoveContainer" containerID="77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.653280 4846 scope.go:117] "RemoveContainer" containerID="60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374" Nov 22 09:35:01 crc kubenswrapper[4846]: E1122 09:35:01.658148 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374\": container with ID starting with 60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374 not found: ID does not exist" containerID="60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.658194 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374"} err="failed to get container status \"60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374\": rpc error: code = NotFound desc = could not find container \"60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374\": container with ID starting with 60efb2d743b35156ddf2cbf861ed4b107264a0102882b4a50b613b27e55b5374 not found: ID does not exist" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.658222 4846 scope.go:117] "RemoveContainer" containerID="77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a" Nov 22 09:35:01 crc kubenswrapper[4846]: E1122 09:35:01.659916 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a\": container with ID starting with 77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a not found: ID 
does not exist" containerID="77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.659940 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a"} err="failed to get container status \"77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a\": rpc error: code = NotFound desc = could not find container \"77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a\": container with ID starting with 77434d4ac38bbc512cf4507ad9c9c53d1a18ca5b127d2cf2cce712d12ae0a49a not found: ID does not exist" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.729908 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lvfpk"] Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.920317 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" (UID: "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.935704 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" (UID: "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.939988 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" (UID: "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.949381 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" (UID: "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.951628 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-config" (OuterVolumeSpecName: "config") pod "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" (UID: "38a874d5-f2f2-42ba-89c7-9fc4a2e7272c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.984275 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.984318 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.984333 4846 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.984348 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:01 crc kubenswrapper[4846]: I1122 09:35:01.984357 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:02 crc kubenswrapper[4846]: I1122 09:35:02.174775 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ljjss"] Nov 22 09:35:02 crc kubenswrapper[4846]: I1122 09:35:02.182521 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ljjss"] Nov 22 09:35:02 crc kubenswrapper[4846]: I1122 09:35:02.549193 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lvfpk" event={"ID":"b782bb33-f288-4cea-8e09-f89cf68c5154","Type":"ContainerStarted","Data":"aa8e32ec449b393f08938d6d53fade9a75e07cda08e61026e31045d7a266c6fb"} Nov 22 09:35:02 crc kubenswrapper[4846]: I1122 09:35:02.549239 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lvfpk" event={"ID":"b782bb33-f288-4cea-8e09-f89cf68c5154","Type":"ContainerStarted","Data":"c4259db6f8704133ff25b26ebbbbbe7c2273d6390360d853cbdc88da3cf80126"} Nov 22 09:35:02 crc kubenswrapper[4846]: I1122 09:35:02.568886 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8240da2-e07e-4b79-81b7-4dffdf4b4c91","Type":"ContainerStarted","Data":"8d94a9ea416b1db1a408cb09c4d6fccad98d828f34c3604a03809d140a8c5550"} Nov 22 09:35:02 crc kubenswrapper[4846]: I1122 09:35:02.570603 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 09:35:02 crc kubenswrapper[4846]: I1122 09:35:02.581026 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-lvfpk" podStartSLOduration=2.5810042600000003 podStartE2EDuration="2.58100426s" podCreationTimestamp="2025-11-22 09:35:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:35:02.572980307 +0000 UTC m=+1277.508669956" watchObservedRunningTime="2025-11-22 09:35:02.58100426 +0000 UTC m=+1277.516693909" Nov 22 09:35:02 crc kubenswrapper[4846]: I1122 09:35:02.600449 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=3.043198363 podStartE2EDuration="6.600421064s" podCreationTimestamp="2025-11-22 09:34:56 +0000 UTC" firstStartedPulling="2025-11-22 09:34:57.693996155 +0000 UTC m=+1272.629685814" lastFinishedPulling="2025-11-22 09:35:01.251218866 +0000 UTC m=+1276.186908515" observedRunningTime="2025-11-22 09:35:02.595628825 +0000 UTC m=+1277.531318474" watchObservedRunningTime="2025-11-22 09:35:02.600421064 +0000 UTC m=+1277.536110713" Nov 22 09:35:04 crc kubenswrapper[4846]: I1122 09:35:04.067766 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" path="/var/lib/kubelet/pods/38a874d5-f2f2-42ba-89c7-9fc4a2e7272c/volumes" Nov 22 09:35:07 crc kubenswrapper[4846]: I1122 09:35:07.660358 4846 generic.go:334] "Generic (PLEG): container finished" podID="b782bb33-f288-4cea-8e09-f89cf68c5154" containerID="aa8e32ec449b393f08938d6d53fade9a75e07cda08e61026e31045d7a266c6fb" exitCode=0 Nov 22 09:35:07 crc kubenswrapper[4846]: I1122 09:35:07.660996 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lvfpk" event={"ID":"b782bb33-f288-4cea-8e09-f89cf68c5154","Type":"ContainerDied","Data":"aa8e32ec449b393f08938d6d53fade9a75e07cda08e61026e31045d7a266c6fb"} Nov 22 09:35:07 crc kubenswrapper[4846]: I1122 09:35:07.847360 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:35:07 crc kubenswrapper[4846]: I1122 09:35:07.847483 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:35:08 crc kubenswrapper[4846]: I1122 09:35:08.870570 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 09:35:08 crc kubenswrapper[4846]: I1122 09:35:08.870868 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.151711 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.184037 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64f77\" (UniqueName: \"kubernetes.io/projected/b782bb33-f288-4cea-8e09-f89cf68c5154-kube-api-access-64f77\") pod \"b782bb33-f288-4cea-8e09-f89cf68c5154\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.184288 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-combined-ca-bundle\") pod \"b782bb33-f288-4cea-8e09-f89cf68c5154\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.184337 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-scripts\") pod \"b782bb33-f288-4cea-8e09-f89cf68c5154\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.184482 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-config-data\") pod \"b782bb33-f288-4cea-8e09-f89cf68c5154\" (UID: \"b782bb33-f288-4cea-8e09-f89cf68c5154\") " Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.222832 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-scripts" (OuterVolumeSpecName: "scripts") pod "b782bb33-f288-4cea-8e09-f89cf68c5154" (UID: "b782bb33-f288-4cea-8e09-f89cf68c5154"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.226235 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b782bb33-f288-4cea-8e09-f89cf68c5154-kube-api-access-64f77" (OuterVolumeSpecName: "kube-api-access-64f77") pod "b782bb33-f288-4cea-8e09-f89cf68c5154" (UID: "b782bb33-f288-4cea-8e09-f89cf68c5154"). InnerVolumeSpecName "kube-api-access-64f77". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.226764 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-config-data" (OuterVolumeSpecName: "config-data") pod "b782bb33-f288-4cea-8e09-f89cf68c5154" (UID: "b782bb33-f288-4cea-8e09-f89cf68c5154"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.237790 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b782bb33-f288-4cea-8e09-f89cf68c5154" (UID: "b782bb33-f288-4cea-8e09-f89cf68c5154"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.287905 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64f77\" (UniqueName: \"kubernetes.io/projected/b782bb33-f288-4cea-8e09-f89cf68c5154-kube-api-access-64f77\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.287972 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.287988 4846 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.288007 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b782bb33-f288-4cea-8e09-f89cf68c5154-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.687980 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lvfpk" event={"ID":"b782bb33-f288-4cea-8e09-f89cf68c5154","Type":"ContainerDied","Data":"c4259db6f8704133ff25b26ebbbbbe7c2273d6390360d853cbdc88da3cf80126"} Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.688020 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4259db6f8704133ff25b26ebbbbbe7c2273d6390360d853cbdc88da3cf80126" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.688149 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lvfpk" Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.905519 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.906028 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-log" containerID="cri-o://9364d643445c4a1c9cb1400f2ee8b1d0aeff33c8e3b586cbcb2bd398ced15413" gracePeriod=30 Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.906359 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-api" containerID="cri-o://36802b5be6a63763b03548bdc20aa224351a9117d7dd2fb0392b7d5d0b1c2a64" gracePeriod=30 Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.933778 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.934651 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="485d8018-77aa-40ce-8978-8126c84202ac" containerName="nova-scheduler-scheduler" containerID="cri-o://6fc634df0b706be58438afbf2725cee08688d7a2b6044afb9c45fee5dac97656" gracePeriod=30 Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.944844 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.945125 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" 
containerName="nova-metadata-log" containerID="cri-o://ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a" gracePeriod=30 Nov 22 09:35:09 crc kubenswrapper[4846]: I1122 09:35:09.945587 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-metadata" containerID="cri-o://500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3" gracePeriod=30 Nov 22 09:35:10 crc kubenswrapper[4846]: I1122 09:35:10.701989 4846 generic.go:334] "Generic (PLEG): container finished" podID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerID="ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a" exitCode=143 Nov 22 09:35:10 crc kubenswrapper[4846]: I1122 09:35:10.702094 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061","Type":"ContainerDied","Data":"ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a"} Nov 22 09:35:10 crc kubenswrapper[4846]: I1122 09:35:10.704792 4846 generic.go:334] "Generic (PLEG): container finished" podID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerID="9364d643445c4a1c9cb1400f2ee8b1d0aeff33c8e3b586cbcb2bd398ced15413" exitCode=143 Nov 22 09:35:10 crc kubenswrapper[4846]: I1122 09:35:10.704826 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a3e58bf-db06-4335-8193-55e1b109d0de","Type":"ContainerDied","Data":"9364d643445c4a1c9cb1400f2ee8b1d0aeff33c8e3b586cbcb2bd398ced15413"} Nov 22 09:35:11 crc kubenswrapper[4846]: E1122 09:35:11.423492 4846 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fc634df0b706be58438afbf2725cee08688d7a2b6044afb9c45fee5dac97656" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 09:35:11 crc kubenswrapper[4846]: E1122 09:35:11.425701 4846 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fc634df0b706be58438afbf2725cee08688d7a2b6044afb9c45fee5dac97656" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 09:35:11 crc kubenswrapper[4846]: E1122 09:35:11.427808 4846 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fc634df0b706be58438afbf2725cee08688d7a2b6044afb9c45fee5dac97656" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 09:35:11 crc kubenswrapper[4846]: E1122 09:35:11.427903 4846 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="485d8018-77aa-40ce-8978-8126c84202ac" containerName="nova-scheduler-scheduler" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.095735 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:41160->10.217.0.194:8775: read: connection reset by peer" Nov 22 09:35:13 crc 
kubenswrapper[4846]: I1122 09:35:13.095735 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:41172->10.217.0.194:8775: read: connection reset by peer" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.624497 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.687930 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45rvb\" (UniqueName: \"kubernetes.io/projected/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-kube-api-access-45rvb\") pod \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.688620 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-logs\") pod \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.688746 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-nova-metadata-tls-certs\") pod \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.688772 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-combined-ca-bundle\") pod \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.688901 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-config-data\") pod \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\" (UID: \"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061\") " Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.689407 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-logs" (OuterVolumeSpecName: "logs") pod "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" (UID: "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.717007 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-kube-api-access-45rvb" (OuterVolumeSpecName: "kube-api-access-45rvb") pod "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" (UID: "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061"). InnerVolumeSpecName "kube-api-access-45rvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.722189 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-config-data" (OuterVolumeSpecName: "config-data") pod "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" (UID: "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.734471 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" (UID: "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.762844 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" (UID: "b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.766751 4846 generic.go:334] "Generic (PLEG): container finished" podID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerID="500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3" exitCode=0 Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.766807 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061","Type":"ContainerDied","Data":"500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3"} Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.766844 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061","Type":"ContainerDied","Data":"8e6ada2fc0c5a6a9ead672f7bd019883930853b540ffc5b6c98ac4b437bf2e6f"} Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.766865 4846 scope.go:117] "RemoveContainer" containerID="500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.767105 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.795765 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.795813 4846 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.795835 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.795858 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.795874 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45rvb\" (UniqueName: \"kubernetes.io/projected/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061-kube-api-access-45rvb\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.839883 4846 scope.go:117] "RemoveContainer" containerID="ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.868249 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892224 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892299 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:35:13 crc kubenswrapper[4846]: E1122 09:35:13.892667 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" containerName="dnsmasq-dns" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892681 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" containerName="dnsmasq-dns" Nov 22 09:35:13 crc kubenswrapper[4846]: E1122 09:35:13.892703 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-log" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892712 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-log" Nov 22 09:35:13 crc kubenswrapper[4846]: E1122 09:35:13.892729 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b782bb33-f288-4cea-8e09-f89cf68c5154" containerName="nova-manage" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892736 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="b782bb33-f288-4cea-8e09-f89cf68c5154" containerName="nova-manage" Nov 22 09:35:13 crc kubenswrapper[4846]: E1122 09:35:13.892745 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-metadata" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892752 4846 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-metadata" Nov 22 09:35:13 crc kubenswrapper[4846]: E1122 09:35:13.892769 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" containerName="init" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892774 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" containerName="init" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892961 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-metadata" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892976 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="b782bb33-f288-4cea-8e09-f89cf68c5154" containerName="nova-manage" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.892995 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a874d5-f2f2-42ba-89c7-9fc4a2e7272c" containerName="dnsmasq-dns" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.893005 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" containerName="nova-metadata-log" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.893975 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.894066 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.909227 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.910869 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.933659 4846 scope.go:117] "RemoveContainer" containerID="500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3" Nov 22 09:35:13 crc kubenswrapper[4846]: E1122 09:35:13.935702 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3\": container with ID starting with 500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3 not found: ID does not exist" containerID="500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.935747 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3"} err="failed to get container status \"500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3\": rpc error: code = NotFound desc = could not find container \"500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3\": container with ID starting with 500cadf8a53be1fce81361e8e4a46cf8ddfdf09f10192f88907e41af5dd23fe3 not found: ID does not exist" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.935775 4846 scope.go:117] "RemoveContainer" containerID="ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a" Nov 22 09:35:13 crc kubenswrapper[4846]: E1122 09:35:13.936096 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a\": container with ID starting with ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a not found: ID does not exist" containerID="ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a" Nov 22 09:35:13 crc kubenswrapper[4846]: I1122 09:35:13.936129 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a"} err="failed to get container status \"ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a\": rpc error: code = NotFound desc = could not find container \"ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a\": container with ID starting with ffb03945dee74144415982eae3803c2c559e3e6eca3cc2aa47203b81c9015e9a not found: ID does not exist" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.011522 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-982vq\" (UniqueName: \"kubernetes.io/projected/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-kube-api-access-982vq\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.011623 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-config-data\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.011886 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.011955 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.012067 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-logs\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.052168 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061" path="/var/lib/kubelet/pods/b2b6f6dd-b5bb-4c6b-b2e0-d147c32ad061/volumes" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.118577 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-config-data\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.118755 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.118790 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.118825 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-logs\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.119090 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-982vq\" (UniqueName: \"kubernetes.io/projected/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-kube-api-access-982vq\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.119586 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-logs\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.128856 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.129010 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.134019 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-config-data\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.145412 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-982vq\" (UniqueName: \"kubernetes.io/projected/79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea-kube-api-access-982vq\") pod \"nova-metadata-0\" (UID: \"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea\") " pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.235099 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.738149 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:35:14 crc kubenswrapper[4846]: W1122 09:35:14.758302 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79fac89e_72b8_4ee5_a2b4_4a56caf2a3ea.slice/crio-9f3857ec4c4de8f72f1116db9197103ceae6ee4613847c0aa36a3689f0a027eb WatchSource:0}: Error finding container 9f3857ec4c4de8f72f1116db9197103ceae6ee4613847c0aa36a3689f0a027eb: Status 404 returned error can't find the container with id 9f3857ec4c4de8f72f1116db9197103ceae6ee4613847c0aa36a3689f0a027eb Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.785057 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea","Type":"ContainerStarted","Data":"9f3857ec4c4de8f72f1116db9197103ceae6ee4613847c0aa36a3689f0a027eb"} Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.787611 4846 generic.go:334] "Generic (PLEG): container finished" podID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerID="36802b5be6a63763b03548bdc20aa224351a9117d7dd2fb0392b7d5d0b1c2a64" exitCode=0 Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.787660 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a3e58bf-db06-4335-8193-55e1b109d0de","Type":"ContainerDied","Data":"36802b5be6a63763b03548bdc20aa224351a9117d7dd2fb0392b7d5d0b1c2a64"} Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.787908 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a3e58bf-db06-4335-8193-55e1b109d0de","Type":"ContainerDied","Data":"75b894522c586a0e275921a5b028fd0dce2370a841cdebbe2e34fc56890f328c"} Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.788000 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75b894522c586a0e275921a5b028fd0dce2370a841cdebbe2e34fc56890f328c" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.806516 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.937076 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-combined-ca-bundle\") pod \"8a3e58bf-db06-4335-8193-55e1b109d0de\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.937166 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-config-data\") pod \"8a3e58bf-db06-4335-8193-55e1b109d0de\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.937234 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-internal-tls-certs\") pod \"8a3e58bf-db06-4335-8193-55e1b109d0de\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.937405 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3e58bf-db06-4335-8193-55e1b109d0de-logs\") pod \"8a3e58bf-db06-4335-8193-55e1b109d0de\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.937432 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-public-tls-certs\") pod \"8a3e58bf-db06-4335-8193-55e1b109d0de\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.937468 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqzdh\" (UniqueName: \"kubernetes.io/projected/8a3e58bf-db06-4335-8193-55e1b109d0de-kube-api-access-hqzdh\") pod \"8a3e58bf-db06-4335-8193-55e1b109d0de\" (UID: \"8a3e58bf-db06-4335-8193-55e1b109d0de\") " Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.940224 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3e58bf-db06-4335-8193-55e1b109d0de-logs" (OuterVolumeSpecName: "logs") pod "8a3e58bf-db06-4335-8193-55e1b109d0de" (UID: "8a3e58bf-db06-4335-8193-55e1b109d0de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.946904 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3e58bf-db06-4335-8193-55e1b109d0de-kube-api-access-hqzdh" (OuterVolumeSpecName: "kube-api-access-hqzdh") pod "8a3e58bf-db06-4335-8193-55e1b109d0de" (UID: "8a3e58bf-db06-4335-8193-55e1b109d0de"). InnerVolumeSpecName "kube-api-access-hqzdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.980712 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a3e58bf-db06-4335-8193-55e1b109d0de" (UID: "8a3e58bf-db06-4335-8193-55e1b109d0de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:14 crc kubenswrapper[4846]: I1122 09:35:14.985590 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-config-data" (OuterVolumeSpecName: "config-data") pod "8a3e58bf-db06-4335-8193-55e1b109d0de" (UID: "8a3e58bf-db06-4335-8193-55e1b109d0de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.031007 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8a3e58bf-db06-4335-8193-55e1b109d0de" (UID: "8a3e58bf-db06-4335-8193-55e1b109d0de"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.039718 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.039770 4846 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.039785 4846 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3e58bf-db06-4335-8193-55e1b109d0de-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.039795 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqzdh\" (UniqueName: \"kubernetes.io/projected/8a3e58bf-db06-4335-8193-55e1b109d0de-kube-api-access-hqzdh\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.039807 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.041173 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8a3e58bf-db06-4335-8193-55e1b109d0de" (UID: "8a3e58bf-db06-4335-8193-55e1b109d0de"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.142741 4846 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3e58bf-db06-4335-8193-55e1b109d0de-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.805964 4846 generic.go:334] "Generic (PLEG): container finished" podID="485d8018-77aa-40ce-8978-8126c84202ac" containerID="6fc634df0b706be58438afbf2725cee08688d7a2b6044afb9c45fee5dac97656" exitCode=0 Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.806069 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"485d8018-77aa-40ce-8978-8126c84202ac","Type":"ContainerDied","Data":"6fc634df0b706be58438afbf2725cee08688d7a2b6044afb9c45fee5dac97656"} Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.809009 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea","Type":"ContainerStarted","Data":"87c1696b355fa55fcd1f8960804332067433db04d11613a195c5231d0231ccea"} Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.809112 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.809122 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea","Type":"ContainerStarted","Data":"8442bbfa5c560a5015ba743a6ac01929c195e59be83f69c7caea8fa9d92b70c0"} Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.841454 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8413725530000002 podStartE2EDuration="2.841372553s" podCreationTimestamp="2025-11-22 09:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:35:15.834161324 +0000 UTC m=+1290.769850993" watchObservedRunningTime="2025-11-22 09:35:15.841372553 +0000 UTC m=+1290.777062212" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.874355 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.896704 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.915089 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 09:35:15 crc kubenswrapper[4846]: E1122 09:35:15.915673 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-api" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.915700 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-api" Nov 22 09:35:15 crc kubenswrapper[4846]: E1122 09:35:15.915721 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-log" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.915733 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-log" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.915997 4846 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-log" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.916060 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" containerName="nova-api-api" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.917603 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.921751 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.926899 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.929763 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.932023 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:35:15 crc kubenswrapper[4846]: I1122 09:35:15.979920 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.053641 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3e58bf-db06-4335-8193-55e1b109d0de" path="/var/lib/kubelet/pods/8a3e58bf-db06-4335-8193-55e1b109d0de/volumes" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.069524 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-config-data\") pod \"485d8018-77aa-40ce-8978-8126c84202ac\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.069738 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-combined-ca-bundle\") pod \"485d8018-77aa-40ce-8978-8126c84202ac\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.070380 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqlvr\" (UniqueName: \"kubernetes.io/projected/485d8018-77aa-40ce-8978-8126c84202ac-kube-api-access-nqlvr\") pod \"485d8018-77aa-40ce-8978-8126c84202ac\" (UID: \"485d8018-77aa-40ce-8978-8126c84202ac\") " Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.070774 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-internal-tls-certs\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.070861 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-public-tls-certs\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.070929 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.070972 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-config-data\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.071055 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb429\" (UniqueName: \"kubernetes.io/projected/732fe70d-07f5-455f-b20a-5a4d0c92c764-kube-api-access-pb429\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.071125 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/732fe70d-07f5-455f-b20a-5a4d0c92c764-logs\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.078630 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485d8018-77aa-40ce-8978-8126c84202ac-kube-api-access-nqlvr" (OuterVolumeSpecName: "kube-api-access-nqlvr") pod "485d8018-77aa-40ce-8978-8126c84202ac" (UID: "485d8018-77aa-40ce-8978-8126c84202ac"). InnerVolumeSpecName "kube-api-access-nqlvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.107287 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-config-data" (OuterVolumeSpecName: "config-data") pod "485d8018-77aa-40ce-8978-8126c84202ac" (UID: "485d8018-77aa-40ce-8978-8126c84202ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.129334 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "485d8018-77aa-40ce-8978-8126c84202ac" (UID: "485d8018-77aa-40ce-8978-8126c84202ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.173011 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/732fe70d-07f5-455f-b20a-5a4d0c92c764-logs\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.173163 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-internal-tls-certs\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.173210 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-public-tls-certs\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.173257 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.173323 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-config-data\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.173388 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb429\" (UniqueName: \"kubernetes.io/projected/732fe70d-07f5-455f-b20a-5a4d0c92c764-kube-api-access-pb429\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.173478 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqlvr\" (UniqueName: \"kubernetes.io/projected/485d8018-77aa-40ce-8978-8126c84202ac-kube-api-access-nqlvr\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.173495 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.173511 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485d8018-77aa-40ce-8978-8126c84202ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.174506 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/732fe70d-07f5-455f-b20a-5a4d0c92c764-logs\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.179735 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.182439 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-config-data\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.182454 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-internal-tls-certs\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.183152 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fe70d-07f5-455f-b20a-5a4d0c92c764-public-tls-certs\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.192332 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb429\" (UniqueName: \"kubernetes.io/projected/732fe70d-07f5-455f-b20a-5a4d0c92c764-kube-api-access-pb429\") pod \"nova-api-0\" (UID: \"732fe70d-07f5-455f-b20a-5a4d0c92c764\") " pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.340529 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.851392 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.857634 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"485d8018-77aa-40ce-8978-8126c84202ac","Type":"ContainerDied","Data":"ec4f714d6a8083e950bc7e61a4e49148b4013beb78e691c737f3564e48480221"} Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.857702 4846 scope.go:117] "RemoveContainer" containerID="6fc634df0b706be58438afbf2725cee08688d7a2b6044afb9c45fee5dac97656" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.858284 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.960570 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.976169 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.988670 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:35:16 crc kubenswrapper[4846]: E1122 09:35:16.989322 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485d8018-77aa-40ce-8978-8126c84202ac" containerName="nova-scheduler-scheduler" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.989341 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="485d8018-77aa-40ce-8978-8126c84202ac" containerName="nova-scheduler-scheduler" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.989646 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="485d8018-77aa-40ce-8978-8126c84202ac" containerName="nova-scheduler-scheduler" Nov 22 09:35:16 crc kubenswrapper[4846]: I1122 09:35:16.990719 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:16.998611 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.005207 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:35:17 crc kubenswrapper[4846]: E1122 09:35:17.056724 4846 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod485d8018_77aa_40ce_8978_8126c84202ac.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod485d8018_77aa_40ce_8978_8126c84202ac.slice/crio-ec4f714d6a8083e950bc7e61a4e49148b4013beb78e691c737f3564e48480221\": RecentStats: unable to find data in memory cache]" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.094070 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177c2c54-1d5d-409c-8592-141b25fc59cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"177c2c54-1d5d-409c-8592-141b25fc59cc\") " pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.094740 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2p2n\" (UniqueName: \"kubernetes.io/projected/177c2c54-1d5d-409c-8592-141b25fc59cc-kube-api-access-h2p2n\") pod \"nova-scheduler-0\" (UID: \"177c2c54-1d5d-409c-8592-141b25fc59cc\") " pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.094910 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177c2c54-1d5d-409c-8592-141b25fc59cc-config-data\") pod \"nova-scheduler-0\" (UID: \"177c2c54-1d5d-409c-8592-141b25fc59cc\") " pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.197493 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/177c2c54-1d5d-409c-8592-141b25fc59cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"177c2c54-1d5d-409c-8592-141b25fc59cc\") " pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.197754 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2p2n\" (UniqueName: \"kubernetes.io/projected/177c2c54-1d5d-409c-8592-141b25fc59cc-kube-api-access-h2p2n\") pod \"nova-scheduler-0\" (UID: \"177c2c54-1d5d-409c-8592-141b25fc59cc\") " pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.197799 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177c2c54-1d5d-409c-8592-141b25fc59cc-config-data\") pod \"nova-scheduler-0\" (UID: \"177c2c54-1d5d-409c-8592-141b25fc59cc\") " pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.202905 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177c2c54-1d5d-409c-8592-141b25fc59cc-config-data\") pod \"nova-scheduler-0\" (UID: \"177c2c54-1d5d-409c-8592-141b25fc59cc\") " pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.203659 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177c2c54-1d5d-409c-8592-141b25fc59cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"177c2c54-1d5d-409c-8592-141b25fc59cc\") " pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.230351 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2p2n\" (UniqueName: \"kubernetes.io/projected/177c2c54-1d5d-409c-8592-141b25fc59cc-kube-api-access-h2p2n\") pod \"nova-scheduler-0\" (UID: \"177c2c54-1d5d-409c-8592-141b25fc59cc\") " pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.321345 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.860793 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:35:17 crc kubenswrapper[4846]: W1122 09:35:17.864004 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod177c2c54_1d5d_409c_8592_141b25fc59cc.slice/crio-de354729274382c0d8073e8e313238ff3ddc41630ab230eff2834622373b2e6d WatchSource:0}: Error finding container de354729274382c0d8073e8e313238ff3ddc41630ab230eff2834622373b2e6d: Status 404 returned error can't find the container with id de354729274382c0d8073e8e313238ff3ddc41630ab230eff2834622373b2e6d Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.878492 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"732fe70d-07f5-455f-b20a-5a4d0c92c764","Type":"ContainerStarted","Data":"d72723d359955f9f03796293037b4329d2455e033ffe865f961752e6a1f4ea28"} Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.878698 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"732fe70d-07f5-455f-b20a-5a4d0c92c764","Type":"ContainerStarted","Data":"ba4a8b9e992864ad1901a007319c20d448e01caddbdd55d22d3e018457492dda"} Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.878769 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"732fe70d-07f5-455f-b20a-5a4d0c92c764","Type":"ContainerStarted","Data":"b6a414023d3d0e3c853430585bdbf425eb6479ec5506ebad88c3c5cfaa27f5a1"} Nov 22 09:35:17 crc kubenswrapper[4846]: I1122 09:35:17.918184 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.918136577 podStartE2EDuration="2.918136577s" podCreationTimestamp="2025-11-22 09:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:35:17.903035578 +0000 UTC m=+1292.838725247" watchObservedRunningTime="2025-11-22 09:35:17.918136577 +0000 UTC m=+1292.853826276" Nov 22 09:35:18 crc kubenswrapper[4846]: I1122 09:35:18.051081 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485d8018-77aa-40ce-8978-8126c84202ac" path="/var/lib/kubelet/pods/485d8018-77aa-40ce-8978-8126c84202ac/volumes" Nov 22 09:35:18 crc kubenswrapper[4846]: I1122 09:35:18.898127 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"177c2c54-1d5d-409c-8592-141b25fc59cc","Type":"ContainerStarted","Data":"5b315a0793612d552d3bd8ffca869e750a80f8ba5ee4ff3bd2282e8251d4c0fd"} Nov 22 09:35:18 crc kubenswrapper[4846]: I1122 09:35:18.898881 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"177c2c54-1d5d-409c-8592-141b25fc59cc","Type":"ContainerStarted","Data":"de354729274382c0d8073e8e313238ff3ddc41630ab230eff2834622373b2e6d"} Nov 22 09:35:18 crc kubenswrapper[4846]: I1122 09:35:18.927075 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.927038563 podStartE2EDuration="2.927038563s" podCreationTimestamp="2025-11-22 09:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:35:18.920852523 +0000 UTC m=+1293.856542182" 
watchObservedRunningTime="2025-11-22 09:35:18.927038563 +0000 UTC m=+1293.862728222" Nov 22 09:35:19 crc kubenswrapper[4846]: I1122 09:35:19.235519 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:35:19 crc kubenswrapper[4846]: I1122 09:35:19.235592 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:35:22 crc kubenswrapper[4846]: I1122 09:35:22.322527 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 09:35:24 crc kubenswrapper[4846]: I1122 09:35:24.235331 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:35:24 crc kubenswrapper[4846]: I1122 09:35:24.235842 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:35:25 crc kubenswrapper[4846]: I1122 09:35:25.253367 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 09:35:25 crc kubenswrapper[4846]: I1122 09:35:25.253417 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 09:35:26 crc kubenswrapper[4846]: I1122 09:35:26.341323 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:35:26 crc kubenswrapper[4846]: I1122 09:35:26.341445 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:35:27 crc kubenswrapper[4846]: I1122 09:35:27.176736 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 09:35:27 crc kubenswrapper[4846]: I1122 09:35:27.321685 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 09:35:27 crc kubenswrapper[4846]: I1122 09:35:27.356251 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="732fe70d-07f5-455f-b20a-5a4d0c92c764" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 09:35:27 crc kubenswrapper[4846]: I1122 09:35:27.356258 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="732fe70d-07f5-455f-b20a-5a4d0c92c764" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 09:35:27 crc kubenswrapper[4846]: I1122 09:35:27.367960 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 09:35:28 crc kubenswrapper[4846]: I1122 09:35:28.100510 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 09:35:34 crc kubenswrapper[4846]: I1122 09:35:34.242537 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Nov 22 09:35:34 crc kubenswrapper[4846]: I1122 09:35:34.243791 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 09:35:34 crc kubenswrapper[4846]: I1122 09:35:34.252150 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 09:35:34 crc kubenswrapper[4846]: I1122 09:35:34.252970 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 09:35:36 crc kubenswrapper[4846]: I1122 09:35:36.353903 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 09:35:36 crc kubenswrapper[4846]: I1122 09:35:36.354565 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 09:35:36 crc kubenswrapper[4846]: I1122 09:35:36.355105 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 09:35:36 crc kubenswrapper[4846]: I1122 09:35:36.355165 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 09:35:36 crc kubenswrapper[4846]: I1122 09:35:36.363078 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 09:35:36 crc kubenswrapper[4846]: I1122 09:35:36.373652 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 09:35:45 crc kubenswrapper[4846]: I1122 09:35:45.641761 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:35:46 crc kubenswrapper[4846]: I1122 09:35:46.726600 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:35:50 crc kubenswrapper[4846]: I1122 09:35:50.623877 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="899cf49d-9541-4f23-b1a2-887324973fb1" containerName="rabbitmq" containerID="cri-o://81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad" gracePeriod=604796 Nov 22 09:35:50 crc kubenswrapper[4846]: I1122 09:35:50.711531 4846 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="899cf49d-9541-4f23-b1a2-887324973fb1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Nov 22 09:35:51 crc kubenswrapper[4846]: I1122 09:35:51.849304 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" containerName="rabbitmq" containerID="cri-o://1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d" gracePeriod=604795 Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.353095 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.453825 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2wpk\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-kube-api-access-n2wpk\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.453887 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-plugins\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.453930 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-erlang-cookie\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.453999 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-server-conf\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.454125 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.454295 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-tls\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.454350 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-config-data\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.454385 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-confd\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.454490 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/899cf49d-9541-4f23-b1a2-887324973fb1-erlang-cookie-secret\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.454541 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: 
"899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.454565 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-plugins-conf\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.454640 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/899cf49d-9541-4f23-b1a2-887324973fb1-pod-info\") pod \"899cf49d-9541-4f23-b1a2-887324973fb1\" (UID: \"899cf49d-9541-4f23-b1a2-887324973fb1\") " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.455351 4846 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.458290 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.459976 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.465605 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/899cf49d-9541-4f23-b1a2-887324973fb1-pod-info" (OuterVolumeSpecName: "pod-info") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.474215 4846 generic.go:334] "Generic (PLEG): container finished" podID="899cf49d-9541-4f23-b1a2-887324973fb1" containerID="81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad" exitCode=0 Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.474594 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"899cf49d-9541-4f23-b1a2-887324973fb1","Type":"ContainerDied","Data":"81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad"} Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.474641 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"899cf49d-9541-4f23-b1a2-887324973fb1","Type":"ContainerDied","Data":"e01b8f720907754fd524f2a9a8c3511a96698a99fb85d8aa447eda44caf60122"} Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.474663 4846 scope.go:117] "RemoveContainer" containerID="81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.474910 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.482243 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-kube-api-access-n2wpk" (OuterVolumeSpecName: "kube-api-access-n2wpk") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "kube-api-access-n2wpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.484163 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.490803 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899cf49d-9541-4f23-b1a2-887324973fb1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.522334 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.558150 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2wpk\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-kube-api-access-n2wpk\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.558196 4846 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.558232 4846 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.558246 4846 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.558259 4846 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/899cf49d-9541-4f23-b1a2-887324973fb1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.558271 4846 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.558282 4846 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/899cf49d-9541-4f23-b1a2-887324973fb1-pod-info\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.583538 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-config-data" (OuterVolumeSpecName: "config-data") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.612148 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-server-conf" (OuterVolumeSpecName: "server-conf") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.620396 4846 scope.go:117] "RemoveContainer" containerID="56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.656647 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "899cf49d-9541-4f23-b1a2-887324973fb1" (UID: "899cf49d-9541-4f23-b1a2-887324973fb1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.662507 4846 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-server-conf\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.662546 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/899cf49d-9541-4f23-b1a2-887324973fb1-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.662560 4846 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/899cf49d-9541-4f23-b1a2-887324973fb1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.677478 4846 scope.go:117] "RemoveContainer" containerID="81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad" Nov 22 09:35:57 crc kubenswrapper[4846]: E1122 09:35:57.683334 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad\": container with ID starting with 81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad not found: ID does not exist" containerID="81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.683387 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad"} err="failed to get container status \"81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad\": rpc error: code = NotFound desc = could not find container \"81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad\": container with ID starting with 81ce0b2176e067457e25f77b880d2f5b5bcb4b8f3173bc77e2126b59e571c2ad not found: ID does not exist" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.683425 4846 scope.go:117] "RemoveContainer" containerID="56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.685727 4846 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 22 09:35:57 crc kubenswrapper[4846]: E1122 09:35:57.688487 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37\": container with ID starting with 56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37 not found: ID does not exist" containerID="56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.688525 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37"} err="failed to get container status \"56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37\": rpc error: code = NotFound desc = could not find container \"56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37\": container with ID starting with 56507c7e6170f0c5f53dcd157013069eca4d2975da8075b41183044bdb153a37 not found: ID does not exist" 
Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.764590 4846 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.819154 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.833290 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.849391 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:35:57 crc kubenswrapper[4846]: E1122 09:35:57.849906 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899cf49d-9541-4f23-b1a2-887324973fb1" containerName="rabbitmq" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.849927 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="899cf49d-9541-4f23-b1a2-887324973fb1" containerName="rabbitmq" Nov 22 09:35:57 crc kubenswrapper[4846]: E1122 09:35:57.849939 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899cf49d-9541-4f23-b1a2-887324973fb1" containerName="setup-container" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.849948 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="899cf49d-9541-4f23-b1a2-887324973fb1" containerName="setup-container" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.850170 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="899cf49d-9541-4f23-b1a2-887324973fb1" containerName="rabbitmq" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.856056 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.859740 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.859916 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.859944 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.859923 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.860305 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.861737 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.862012 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ddrh2" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.886180 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.969264 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.969423 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9dbf\" (UniqueName: \"kubernetes.io/projected/5b44e9aa-f202-48be-bace-279f29824c1b-kube-api-access-m9dbf\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.969611 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b44e9aa-f202-48be-bace-279f29824c1b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.969844 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b44e9aa-f202-48be-bace-279f29824c1b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.970069 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.970117 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.970397 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b44e9aa-f202-48be-bace-279f29824c1b-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.970509 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b44e9aa-f202-48be-bace-279f29824c1b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.970657 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b44e9aa-f202-48be-bace-279f29824c1b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.970696 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:57 crc kubenswrapper[4846]: I1122 09:35:57.970906 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.051855 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899cf49d-9541-4f23-b1a2-887324973fb1" path="/var/lib/kubelet/pods/899cf49d-9541-4f23-b1a2-887324973fb1/volumes" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074197 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9dbf\" (UniqueName: \"kubernetes.io/projected/5b44e9aa-f202-48be-bace-279f29824c1b-kube-api-access-m9dbf\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074293 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b44e9aa-f202-48be-bace-279f29824c1b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074364 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b44e9aa-f202-48be-bace-279f29824c1b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074421 4846 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074449 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074516 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b44e9aa-f202-48be-bace-279f29824c1b-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074555 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b44e9aa-f202-48be-bace-279f29824c1b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074590 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b44e9aa-f202-48be-bace-279f29824c1b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074634 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074678 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.074721 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.075655 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.076072 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b44e9aa-f202-48be-bace-279f29824c1b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc 
kubenswrapper[4846]: I1122 09:35:58.076091 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.076217 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.076573 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b44e9aa-f202-48be-bace-279f29824c1b-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.077398 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b44e9aa-f202-48be-bace-279f29824c1b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.083430 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b44e9aa-f202-48be-bace-279f29824c1b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.083918 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.084885 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b44e9aa-f202-48be-bace-279f29824c1b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.089144 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b44e9aa-f202-48be-bace-279f29824c1b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.110540 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9dbf\" (UniqueName: \"kubernetes.io/projected/5b44e9aa-f202-48be-bace-279f29824c1b-kube-api-access-m9dbf\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.166624 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5b44e9aa-f202-48be-bace-279f29824c1b\") " 
pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.186084 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.429883 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.497123 4846 generic.go:334] "Generic (PLEG): container finished" podID="98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" containerID="1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d" exitCode=0 Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.497234 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6","Type":"ContainerDied","Data":"1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d"} Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.497729 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6","Type":"ContainerDied","Data":"c24757bc17482d1dfbed934a5590a4bddb270beb159f25100908faab857febfc"} Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.497297 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.499007 4846 scope.go:117] "RemoveContainer" containerID="1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.540649 4846 scope.go:117] "RemoveContainer" containerID="fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.587378 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-plugins\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.587474 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-tls\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.587522 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-plugins-conf\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.587622 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-pod-info\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.587685 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-confd\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: 
\"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.587779 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.587891 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgkdt\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-kube-api-access-pgkdt\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.587942 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-erlang-cookie\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.587990 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-config-data\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.588019 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-erlang-cookie-secret\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.588077 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-server-conf\") pod \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\" (UID: \"98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6\") " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.589414 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.589630 4846 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.591189 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.591769 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.594013 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.595253 4846 scope.go:117] "RemoveContainer" containerID="1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.595258 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-pod-info" (OuterVolumeSpecName: "pod-info") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.595517 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: E1122 09:35:58.595868 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d\": container with ID starting with 1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d not found: ID does not exist" containerID="1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.595907 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d"} err="failed to get container status \"1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d\": rpc error: code = NotFound desc = could not find container \"1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d\": container with ID starting with 1f1ef09193bc01c15c8f3dee9357997a58f399c913a0bbcbef81d32d7579495d not found: ID does not exist" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.595935 4846 scope.go:117] "RemoveContainer" containerID="fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7" Nov 22 09:35:58 crc kubenswrapper[4846]: E1122 09:35:58.596155 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7\": container with ID starting with fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7 not found: ID does not exist" containerID="fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.596178 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7"} err="failed to get container status \"fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7\": rpc error: code = NotFound desc = could not find container \"fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7\": container with ID starting with fc593390b7cc9d2666d35b4ec7ba63f4fcd60f7b3af26b34f9d878abb9e037b7 not found: ID does not exist" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.596713 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.601772 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-kube-api-access-pgkdt" (OuterVolumeSpecName: "kube-api-access-pgkdt") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "kube-api-access-pgkdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.621840 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-config-data" (OuterVolumeSpecName: "config-data") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.627215 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.627280 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.654830 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-server-conf" (OuterVolumeSpecName: "server-conf") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.693468 4846 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.693503 4846 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.693514 4846 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.693526 4846 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-pod-info\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.693559 4846 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.693571 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgkdt\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-kube-api-access-pgkdt\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.693581 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc 
kubenswrapper[4846]: I1122 09:35:58.693591 4846 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.693600 4846 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-server-conf\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.705681 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" (UID: "98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.732654 4846 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.738398 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.795388 4846 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.795427 4846 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.870665 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.886155 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.902562 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:35:58 crc kubenswrapper[4846]: E1122 09:35:58.903109 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" containerName="setup-container" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.903552 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" containerName="setup-container" Nov 22 09:35:58 crc kubenswrapper[4846]: E1122 09:35:58.903592 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" containerName="rabbitmq" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.903600 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" containerName="rabbitmq" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.903812 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" containerName="rabbitmq" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.905266 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.908632 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.908866 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.909115 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.909251 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.909369 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nl27q" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.909485 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.910693 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 22 09:35:58 crc kubenswrapper[4846]: I1122 09:35:58.933300 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.104335 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/812351d5-d992-4243-94c9-3328217b37b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.104435 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.104497 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.104556 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.104884 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/812351d5-d992-4243-94c9-3328217b37b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.105080 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.105134 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/812351d5-d992-4243-94c9-3328217b37b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.105315 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpl56\" (UniqueName: \"kubernetes.io/projected/812351d5-d992-4243-94c9-3328217b37b9-kube-api-access-wpl56\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.105365 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.105434 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/812351d5-d992-4243-94c9-3328217b37b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.105545 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/812351d5-d992-4243-94c9-3328217b37b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207263 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207311 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/812351d5-d992-4243-94c9-3328217b37b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207387 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpl56\" (UniqueName: \"kubernetes.io/projected/812351d5-d992-4243-94c9-3328217b37b9-kube-api-access-wpl56\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207418 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207449 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/812351d5-d992-4243-94c9-3328217b37b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207487 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/812351d5-d992-4243-94c9-3328217b37b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207520 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/812351d5-d992-4243-94c9-3328217b37b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207549 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207570 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207596 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207633 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/812351d5-d992-4243-94c9-3328217b37b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.207776 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.208653 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.208972 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/812351d5-d992-4243-94c9-3328217b37b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.209078 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/812351d5-d992-4243-94c9-3328217b37b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.209224 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/812351d5-d992-4243-94c9-3328217b37b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.209301 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.214813 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/812351d5-d992-4243-94c9-3328217b37b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.215014 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/812351d5-d992-4243-94c9-3328217b37b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.215798 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.217931 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/812351d5-d992-4243-94c9-3328217b37b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.236957 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpl56\" (UniqueName: \"kubernetes.io/projected/812351d5-d992-4243-94c9-3328217b37b9-kube-api-access-wpl56\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.248645 4846 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"812351d5-d992-4243-94c9-3328217b37b9\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.289563 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.543405 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b44e9aa-f202-48be-bace-279f29824c1b","Type":"ContainerStarted","Data":"9fcffe34370ad65f134f6e875c69c623cea17062a5966ec2d3a18fb9ed02302b"} Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.554166 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-484sd"] Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.556499 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.560310 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.573063 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-484sd"] Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.719271 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.719322 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-config\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.719359 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.719516 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r7gh\" (UniqueName: \"kubernetes.io/projected/0194f2cd-5396-4f6c-beae-ad240efc9dd1-kube-api-access-4r7gh\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.719770 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.719805 4846 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.719892 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.807244 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.822871 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.822934 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-config\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.822979 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.823031 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r7gh\" (UniqueName: \"kubernetes.io/projected/0194f2cd-5396-4f6c-beae-ad240efc9dd1-kube-api-access-4r7gh\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.823132 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.823155 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.823187 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.824616 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.824626 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.824898 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.824957 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.825304 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.825355 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-config\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.879882 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r7gh\" (UniqueName: \"kubernetes.io/projected/0194f2cd-5396-4f6c-beae-ad240efc9dd1-kube-api-access-4r7gh\") pod \"dnsmasq-dns-79bd4cc8c9-484sd\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:35:59 crc kubenswrapper[4846]: W1122 09:35:59.883072 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod812351d5_d992_4243_94c9_3328217b37b9.slice/crio-f103e4ff5084415a93d97a39afd53541d3209bc8dc7b77da06a1f511959d8c54 WatchSource:0}: Error finding container f103e4ff5084415a93d97a39afd53541d3209bc8dc7b77da06a1f511959d8c54: Status 404 returned error can't find the container with id f103e4ff5084415a93d97a39afd53541d3209bc8dc7b77da06a1f511959d8c54 Nov 22 09:35:59 crc kubenswrapper[4846]: I1122 09:35:59.901849 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:36:00 crc kubenswrapper[4846]: I1122 09:36:00.078718 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6" path="/var/lib/kubelet/pods/98fa314c-0c1f-4dbc-86e0-1f29fd0b52c6/volumes" Nov 22 09:36:00 crc kubenswrapper[4846]: I1122 09:36:00.436821 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-484sd"] Nov 22 09:36:00 crc kubenswrapper[4846]: W1122 09:36:00.448672 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0194f2cd_5396_4f6c_beae_ad240efc9dd1.slice/crio-ebda192c11c255f6d005d1c3af93e9d0ea37ded7f4d0085cc9794f94a9e40d77 WatchSource:0}: Error finding container ebda192c11c255f6d005d1c3af93e9d0ea37ded7f4d0085cc9794f94a9e40d77: Status 404 returned error can't find the container with id ebda192c11c255f6d005d1c3af93e9d0ea37ded7f4d0085cc9794f94a9e40d77 Nov 22 09:36:00 crc kubenswrapper[4846]: I1122 09:36:00.563677 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" event={"ID":"0194f2cd-5396-4f6c-beae-ad240efc9dd1","Type":"ContainerStarted","Data":"ebda192c11c255f6d005d1c3af93e9d0ea37ded7f4d0085cc9794f94a9e40d77"} Nov 22 09:36:00 crc kubenswrapper[4846]: I1122 09:36:00.565912 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"812351d5-d992-4243-94c9-3328217b37b9","Type":"ContainerStarted","Data":"f103e4ff5084415a93d97a39afd53541d3209bc8dc7b77da06a1f511959d8c54"} Nov 22 09:36:01 crc kubenswrapper[4846]: I1122 09:36:01.582931 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b44e9aa-f202-48be-bace-279f29824c1b","Type":"ContainerStarted","Data":"e1052ddb5ea71a1adf42c7cdb2b5c40fe072033bf302012629987ed43f449a13"} Nov 22 09:36:01 crc kubenswrapper[4846]: I1122 09:36:01.585145 4846 generic.go:334] "Generic (PLEG): container finished" podID="0194f2cd-5396-4f6c-beae-ad240efc9dd1" containerID="474b5031ecb8714783ac13b530657bf7561faf0e60200e2cbc26eb8b51f014a3" exitCode=0 Nov 22 09:36:01 crc kubenswrapper[4846]: I1122 09:36:01.585219 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" event={"ID":"0194f2cd-5396-4f6c-beae-ad240efc9dd1","Type":"ContainerDied","Data":"474b5031ecb8714783ac13b530657bf7561faf0e60200e2cbc26eb8b51f014a3"} Nov 22 09:36:02 crc kubenswrapper[4846]: I1122 09:36:02.603400 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"812351d5-d992-4243-94c9-3328217b37b9","Type":"ContainerStarted","Data":"84beb3cadfa8922aed80b0bfae56bae3053b97cb2228377ce15980dc806cccaf"} Nov 22 09:36:02 crc kubenswrapper[4846]: I1122 09:36:02.606891 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" event={"ID":"0194f2cd-5396-4f6c-beae-ad240efc9dd1","Type":"ContainerStarted","Data":"d7834215d0482ac4b2c74c6f4b44f7c9aab8d1a8dccc9808fdf6da56cecd34a5"} Nov 22 09:36:02 crc kubenswrapper[4846]: I1122 09:36:02.607234 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:36:02 crc kubenswrapper[4846]: I1122 09:36:02.674115 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" podStartSLOduration=3.674083645 
podStartE2EDuration="3.674083645s" podCreationTimestamp="2025-11-22 09:35:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:36:02.665873997 +0000 UTC m=+1337.601563646" watchObservedRunningTime="2025-11-22 09:36:02.674083645 +0000 UTC m=+1337.609773304" Nov 22 09:36:09 crc kubenswrapper[4846]: I1122 09:36:09.904758 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.066097 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-wbtbs"] Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.066491 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" podUID="defce30c-d2ab-4153-91de-c76acd4c3529" containerName="dnsmasq-dns" containerID="cri-o://f249fe84a9c2f30e2d5ecec9b5b3cbeb411fba19b8045fe57eec7f3aaad3dcbd" gracePeriod=10 Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.292874 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-b22zv"] Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.301978 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.326958 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-b22zv"] Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.426119 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9s9x\" (UniqueName: \"kubernetes.io/projected/fb7382e7-13c7-4cf5-9462-b58b330e0315-kube-api-access-g9s9x\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.426208 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.426273 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.426297 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-dns-svc\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.426362 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: 
\"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.426414 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.426446 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-config\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.528294 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.528380 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.528409 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-config\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.528483 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9s9x\" (UniqueName: \"kubernetes.io/projected/fb7382e7-13c7-4cf5-9462-b58b330e0315-kube-api-access-g9s9x\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.528538 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.528568 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.528585 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-dns-svc\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: 
\"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.529388 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.529450 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-dns-svc\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.530008 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.530331 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.530609 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.530849 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb7382e7-13c7-4cf5-9462-b58b330e0315-config\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.556633 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9s9x\" (UniqueName: \"kubernetes.io/projected/fb7382e7-13c7-4cf5-9462-b58b330e0315-kube-api-access-g9s9x\") pod \"dnsmasq-dns-55478c4467-b22zv\" (UID: \"fb7382e7-13c7-4cf5-9462-b58b330e0315\") " pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.621241 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.742705 4846 generic.go:334] "Generic (PLEG): container finished" podID="defce30c-d2ab-4153-91de-c76acd4c3529" containerID="f249fe84a9c2f30e2d5ecec9b5b3cbeb411fba19b8045fe57eec7f3aaad3dcbd" exitCode=0 Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.743105 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" event={"ID":"defce30c-d2ab-4153-91de-c76acd4c3529","Type":"ContainerDied","Data":"f249fe84a9c2f30e2d5ecec9b5b3cbeb411fba19b8045fe57eec7f3aaad3dcbd"} Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.743140 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" event={"ID":"defce30c-d2ab-4153-91de-c76acd4c3529","Type":"ContainerDied","Data":"971caf9fdc7eded19f8ae2abcae13ef9620b78f14c9b5014fcf18dbed09df5d8"} Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.743153 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="971caf9fdc7eded19f8ae2abcae13ef9620b78f14c9b5014fcf18dbed09df5d8" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.760845 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.833923 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-nb\") pod \"defce30c-d2ab-4153-91de-c76acd4c3529\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.833988 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-swift-storage-0\") pod \"defce30c-d2ab-4153-91de-c76acd4c3529\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.834061 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-config\") pod \"defce30c-d2ab-4153-91de-c76acd4c3529\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.834240 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpnnc\" (UniqueName: \"kubernetes.io/projected/defce30c-d2ab-4153-91de-c76acd4c3529-kube-api-access-hpnnc\") pod \"defce30c-d2ab-4153-91de-c76acd4c3529\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.834265 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-svc\") pod \"defce30c-d2ab-4153-91de-c76acd4c3529\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.834473 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-sb\") pod \"defce30c-d2ab-4153-91de-c76acd4c3529\" (UID: \"defce30c-d2ab-4153-91de-c76acd4c3529\") " Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.842335 4846 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defce30c-d2ab-4153-91de-c76acd4c3529-kube-api-access-hpnnc" (OuterVolumeSpecName: "kube-api-access-hpnnc") pod "defce30c-d2ab-4153-91de-c76acd4c3529" (UID: "defce30c-d2ab-4153-91de-c76acd4c3529"). InnerVolumeSpecName "kube-api-access-hpnnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.904958 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "defce30c-d2ab-4153-91de-c76acd4c3529" (UID: "defce30c-d2ab-4153-91de-c76acd4c3529"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.911419 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "defce30c-d2ab-4153-91de-c76acd4c3529" (UID: "defce30c-d2ab-4153-91de-c76acd4c3529"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.920897 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "defce30c-d2ab-4153-91de-c76acd4c3529" (UID: "defce30c-d2ab-4153-91de-c76acd4c3529"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.928966 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-config" (OuterVolumeSpecName: "config") pod "defce30c-d2ab-4153-91de-c76acd4c3529" (UID: "defce30c-d2ab-4153-91de-c76acd4c3529"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.937487 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpnnc\" (UniqueName: \"kubernetes.io/projected/defce30c-d2ab-4153-91de-c76acd4c3529-kube-api-access-hpnnc\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.937525 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.937537 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.937545 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.937555 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:10 crc kubenswrapper[4846]: I1122 09:36:10.942240 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "defce30c-d2ab-4153-91de-c76acd4c3529" (UID: "defce30c-d2ab-4153-91de-c76acd4c3529"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:11 crc kubenswrapper[4846]: I1122 09:36:11.039905 4846 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/defce30c-d2ab-4153-91de-c76acd4c3529-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:11 crc kubenswrapper[4846]: I1122 09:36:11.134342 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-b22zv"] Nov 22 09:36:11 crc kubenswrapper[4846]: I1122 09:36:11.759307 4846 generic.go:334] "Generic (PLEG): container finished" podID="fb7382e7-13c7-4cf5-9462-b58b330e0315" containerID="53b9c0d7e8f1b438d825035c3be0a9e46c2f06b37cd14de8db294b5f271b1c74" exitCode=0 Nov 22 09:36:11 crc kubenswrapper[4846]: I1122 09:36:11.759417 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-wbtbs" Nov 22 09:36:11 crc kubenswrapper[4846]: I1122 09:36:11.759888 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-b22zv" event={"ID":"fb7382e7-13c7-4cf5-9462-b58b330e0315","Type":"ContainerDied","Data":"53b9c0d7e8f1b438d825035c3be0a9e46c2f06b37cd14de8db294b5f271b1c74"} Nov 22 09:36:11 crc kubenswrapper[4846]: I1122 09:36:11.760208 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-b22zv" event={"ID":"fb7382e7-13c7-4cf5-9462-b58b330e0315","Type":"ContainerStarted","Data":"eb2717701ebc79103065b74fd26a06488ea75b5bc4933d5cc43a895f48689f87"} Nov 22 09:36:11 crc kubenswrapper[4846]: I1122 09:36:11.895709 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-wbtbs"] Nov 22 09:36:11 crc kubenswrapper[4846]: I1122 09:36:11.906209 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-wbtbs"] Nov 22 09:36:12 crc kubenswrapper[4846]: I1122 09:36:12.048621 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defce30c-d2ab-4153-91de-c76acd4c3529" path="/var/lib/kubelet/pods/defce30c-d2ab-4153-91de-c76acd4c3529/volumes" Nov 22 09:36:12 crc kubenswrapper[4846]: I1122 09:36:12.772414 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-b22zv" event={"ID":"fb7382e7-13c7-4cf5-9462-b58b330e0315","Type":"ContainerStarted","Data":"1df069e7301822cdb4255715a1effaa145dae634fe9ca699536dede44a9f96aa"} Nov 22 09:36:12 crc kubenswrapper[4846]: I1122 09:36:12.774210 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:12 crc kubenswrapper[4846]: I1122 09:36:12.811389 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-b22zv" podStartSLOduration=2.811366583 podStartE2EDuration="2.811366583s" podCreationTimestamp="2025-11-22 09:36:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:36:12.801778865 +0000 UTC m=+1347.737468534" watchObservedRunningTime="2025-11-22 09:36:12.811366583 +0000 UTC m=+1347.747056242" Nov 22 09:36:20 crc kubenswrapper[4846]: I1122 09:36:20.624463 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-b22zv" Nov 22 09:36:20 crc kubenswrapper[4846]: I1122 09:36:20.727752 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-484sd"] Nov 22 09:36:20 crc kubenswrapper[4846]: I1122 09:36:20.728173 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" podUID="0194f2cd-5396-4f6c-beae-ad240efc9dd1" containerName="dnsmasq-dns" containerID="cri-o://d7834215d0482ac4b2c74c6f4b44f7c9aab8d1a8dccc9808fdf6da56cecd34a5" gracePeriod=10 Nov 22 09:36:20 crc kubenswrapper[4846]: I1122 09:36:20.893164 4846 generic.go:334] "Generic (PLEG): container finished" podID="0194f2cd-5396-4f6c-beae-ad240efc9dd1" containerID="d7834215d0482ac4b2c74c6f4b44f7c9aab8d1a8dccc9808fdf6da56cecd34a5" exitCode=0 Nov 22 09:36:20 crc kubenswrapper[4846]: I1122 09:36:20.893357 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" 
event={"ID":"0194f2cd-5396-4f6c-beae-ad240efc9dd1","Type":"ContainerDied","Data":"d7834215d0482ac4b2c74c6f4b44f7c9aab8d1a8dccc9808fdf6da56cecd34a5"} Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.827811 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.917525 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" event={"ID":"0194f2cd-5396-4f6c-beae-ad240efc9dd1","Type":"ContainerDied","Data":"ebda192c11c255f6d005d1c3af93e9d0ea37ded7f4d0085cc9794f94a9e40d77"} Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.917600 4846 scope.go:117] "RemoveContainer" containerID="d7834215d0482ac4b2c74c6f4b44f7c9aab8d1a8dccc9808fdf6da56cecd34a5" Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.917756 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-484sd" Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.942824 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-nb\") pod \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.942930 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r7gh\" (UniqueName: \"kubernetes.io/projected/0194f2cd-5396-4f6c-beae-ad240efc9dd1-kube-api-access-4r7gh\") pod \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.943150 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-openstack-edpm-ipam\") pod \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.943204 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-sb\") pod \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.943321 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-svc\") pod \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.943401 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-config\") pod \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.943442 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-swift-storage-0\") pod \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\" (UID: \"0194f2cd-5396-4f6c-beae-ad240efc9dd1\") " Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 
09:36:21.944795 4846 scope.go:117] "RemoveContainer" containerID="474b5031ecb8714783ac13b530657bf7561faf0e60200e2cbc26eb8b51f014a3" Nov 22 09:36:21 crc kubenswrapper[4846]: I1122 09:36:21.971677 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0194f2cd-5396-4f6c-beae-ad240efc9dd1-kube-api-access-4r7gh" (OuterVolumeSpecName: "kube-api-access-4r7gh") pod "0194f2cd-5396-4f6c-beae-ad240efc9dd1" (UID: "0194f2cd-5396-4f6c-beae-ad240efc9dd1"). InnerVolumeSpecName "kube-api-access-4r7gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.003224 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0194f2cd-5396-4f6c-beae-ad240efc9dd1" (UID: "0194f2cd-5396-4f6c-beae-ad240efc9dd1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.009968 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-config" (OuterVolumeSpecName: "config") pod "0194f2cd-5396-4f6c-beae-ad240efc9dd1" (UID: "0194f2cd-5396-4f6c-beae-ad240efc9dd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.014375 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "0194f2cd-5396-4f6c-beae-ad240efc9dd1" (UID: "0194f2cd-5396-4f6c-beae-ad240efc9dd1"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.023415 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0194f2cd-5396-4f6c-beae-ad240efc9dd1" (UID: "0194f2cd-5396-4f6c-beae-ad240efc9dd1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.037188 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0194f2cd-5396-4f6c-beae-ad240efc9dd1" (UID: "0194f2cd-5396-4f6c-beae-ad240efc9dd1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.037203 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0194f2cd-5396-4f6c-beae-ad240efc9dd1" (UID: "0194f2cd-5396-4f6c-beae-ad240efc9dd1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.045735 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.045771 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r7gh\" (UniqueName: \"kubernetes.io/projected/0194f2cd-5396-4f6c-beae-ad240efc9dd1-kube-api-access-4r7gh\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.045788 4846 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.045798 4846 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.045810 4846 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.045819 4846 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.045829 4846 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0194f2cd-5396-4f6c-beae-ad240efc9dd1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.243549 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-484sd"] Nov 22 09:36:22 crc kubenswrapper[4846]: I1122 09:36:22.251850 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-484sd"] Nov 22 09:36:24 crc kubenswrapper[4846]: I1122 09:36:24.056941 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0194f2cd-5396-4f6c-beae-ad240efc9dd1" path="/var/lib/kubelet/pods/0194f2cd-5396-4f6c-beae-ad240efc9dd1/volumes" Nov 22 09:36:28 crc kubenswrapper[4846]: I1122 09:36:28.625422 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:36:28 crc kubenswrapper[4846]: I1122 09:36:28.626342 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.889322 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn"] Nov 22 09:36:33 crc kubenswrapper[4846]: E1122 09:36:33.890367 4846 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="defce30c-d2ab-4153-91de-c76acd4c3529" containerName="dnsmasq-dns" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.890380 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="defce30c-d2ab-4153-91de-c76acd4c3529" containerName="dnsmasq-dns" Nov 22 09:36:33 crc kubenswrapper[4846]: E1122 09:36:33.890393 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0194f2cd-5396-4f6c-beae-ad240efc9dd1" containerName="dnsmasq-dns" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.890399 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0194f2cd-5396-4f6c-beae-ad240efc9dd1" containerName="dnsmasq-dns" Nov 22 09:36:33 crc kubenswrapper[4846]: E1122 09:36:33.890429 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0194f2cd-5396-4f6c-beae-ad240efc9dd1" containerName="init" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.890437 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0194f2cd-5396-4f6c-beae-ad240efc9dd1" containerName="init" Nov 22 09:36:33 crc kubenswrapper[4846]: E1122 09:36:33.890462 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defce30c-d2ab-4153-91de-c76acd4c3529" containerName="init" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.890467 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="defce30c-d2ab-4153-91de-c76acd4c3529" containerName="init" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.890639 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="defce30c-d2ab-4153-91de-c76acd4c3529" containerName="dnsmasq-dns" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.890652 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0194f2cd-5396-4f6c-beae-ad240efc9dd1" containerName="dnsmasq-dns" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.891324 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.894942 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.895137 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.895644 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.896401 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.909836 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn"] Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.952671 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.952751 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.952826 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s9sv\" (UniqueName: \"kubernetes.io/projected/2565b5ab-c381-4a01-bc51-98d00dc7ce25-kube-api-access-7s9sv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:33 crc kubenswrapper[4846]: I1122 09:36:33.953158 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.060292 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s9sv\" (UniqueName: \"kubernetes.io/projected/2565b5ab-c381-4a01-bc51-98d00dc7ce25-kube-api-access-7s9sv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.060535 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.060674 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.060712 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.076424 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.081538 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.084318 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s9sv\" (UniqueName: \"kubernetes.io/projected/2565b5ab-c381-4a01-bc51-98d00dc7ce25-kube-api-access-7s9sv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.092849 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.126594 4846 generic.go:334] "Generic (PLEG): container finished" podID="5b44e9aa-f202-48be-bace-279f29824c1b" containerID="e1052ddb5ea71a1adf42c7cdb2b5c40fe072033bf302012629987ed43f449a13" exitCode=0 Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.126660 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b44e9aa-f202-48be-bace-279f29824c1b","Type":"ContainerDied","Data":"e1052ddb5ea71a1adf42c7cdb2b5c40fe072033bf302012629987ed43f449a13"} Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.220567 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" Nov 22 09:36:34 crc kubenswrapper[4846]: I1122 09:36:34.840873 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn"] Nov 22 09:36:34 crc kubenswrapper[4846]: W1122 09:36:34.845145 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2565b5ab_c381_4a01_bc51_98d00dc7ce25.slice/crio-7a5425384d0ce6dc92e12392e00bc38515eee7948a16e8c55ef3c073781ed7cf WatchSource:0}: Error finding container 7a5425384d0ce6dc92e12392e00bc38515eee7948a16e8c55ef3c073781ed7cf: Status 404 returned error can't find the container with id 7a5425384d0ce6dc92e12392e00bc38515eee7948a16e8c55ef3c073781ed7cf Nov 22 09:36:35 crc kubenswrapper[4846]: I1122 09:36:35.139865 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" event={"ID":"2565b5ab-c381-4a01-bc51-98d00dc7ce25","Type":"ContainerStarted","Data":"7a5425384d0ce6dc92e12392e00bc38515eee7948a16e8c55ef3c073781ed7cf"} Nov 22 09:36:35 crc kubenswrapper[4846]: I1122 09:36:35.143556 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b44e9aa-f202-48be-bace-279f29824c1b","Type":"ContainerStarted","Data":"2031c3472ca396e382ac3897566514f46376cee74e32261a376af572cddb8e82"} Nov 22 09:36:35 crc kubenswrapper[4846]: I1122 09:36:35.143882 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 22 09:36:35 crc kubenswrapper[4846]: I1122 09:36:35.186356 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.18633757 podStartE2EDuration="38.18633757s" podCreationTimestamp="2025-11-22 09:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:36:35.176198475 +0000 UTC m=+1370.111888124" watchObservedRunningTime="2025-11-22 09:36:35.18633757 +0000 UTC m=+1370.122027219" Nov 22 09:36:36 crc kubenswrapper[4846]: I1122 09:36:36.180914 4846 generic.go:334] "Generic (PLEG): container finished" podID="812351d5-d992-4243-94c9-3328217b37b9" containerID="84beb3cadfa8922aed80b0bfae56bae3053b97cb2228377ce15980dc806cccaf" exitCode=0 Nov 22 09:36:36 crc kubenswrapper[4846]: I1122 09:36:36.181035 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"812351d5-d992-4243-94c9-3328217b37b9","Type":"ContainerDied","Data":"84beb3cadfa8922aed80b0bfae56bae3053b97cb2228377ce15980dc806cccaf"} Nov 22 09:36:37 crc kubenswrapper[4846]: I1122 09:36:37.196360 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"812351d5-d992-4243-94c9-3328217b37b9","Type":"ContainerStarted","Data":"8ebe820f63e2bdd9807db780105bc6067ff776c6cf1a9db5a414dd6a04276f8e"} Nov 22 09:36:37 crc kubenswrapper[4846]: I1122 09:36:37.197234 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:36:37 crc kubenswrapper[4846]: I1122 09:36:37.232594 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.232567926 podStartE2EDuration="39.232567926s" podCreationTimestamp="2025-11-22 09:35:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:36:37.223959256 +0000 UTC m=+1372.159648915" watchObservedRunningTime="2025-11-22 09:36:37.232567926 +0000 UTC m=+1372.168257575" Nov 22 09:36:46 crc kubenswrapper[4846]: I1122 09:36:46.113220 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:36:47 crc kubenswrapper[4846]: I1122 09:36:47.351040 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" event={"ID":"2565b5ab-c381-4a01-bc51-98d00dc7ce25","Type":"ContainerStarted","Data":"080a784befd21ab524547d4d225ce889545bd54b6f27e18639f0f998acdc443a"} Nov 22 09:36:47 crc kubenswrapper[4846]: I1122 09:36:47.384790 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" podStartSLOduration=3.123852231 podStartE2EDuration="14.384767576s" podCreationTimestamp="2025-11-22 09:36:33 +0000 UTC" firstStartedPulling="2025-11-22 09:36:34.849009104 +0000 UTC m=+1369.784698753" lastFinishedPulling="2025-11-22 09:36:46.109924449 +0000 UTC m=+1381.045614098" observedRunningTime="2025-11-22 09:36:47.380190123 +0000 UTC m=+1382.315879772" watchObservedRunningTime="2025-11-22 09:36:47.384767576 +0000 UTC m=+1382.320457225" Nov 22 09:36:48 crc kubenswrapper[4846]: I1122 09:36:48.190407 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 22 09:36:49 crc kubenswrapper[4846]: I1122 09:36:49.293486 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:36:53 crc kubenswrapper[4846]: I1122 09:36:53.879598 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hbcct"] Nov 22 09:36:53 crc kubenswrapper[4846]: I1122 09:36:53.883715 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:53 crc kubenswrapper[4846]: I1122 09:36:53.907769 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hbcct"] Nov 22 09:36:53 crc kubenswrapper[4846]: I1122 09:36:53.984382 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-catalog-content\") pod \"redhat-operators-hbcct\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") " pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:53 crc kubenswrapper[4846]: I1122 09:36:53.984662 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-utilities\") pod \"redhat-operators-hbcct\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") " pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:53 crc kubenswrapper[4846]: I1122 09:36:53.985010 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbcpt\" (UniqueName: \"kubernetes.io/projected/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-kube-api-access-dbcpt\") pod \"redhat-operators-hbcct\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") " pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:54 crc kubenswrapper[4846]: I1122 09:36:54.087759 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-utilities\") pod \"redhat-operators-hbcct\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") " pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:54 crc kubenswrapper[4846]: I1122 09:36:54.087927 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbcpt\" (UniqueName: \"kubernetes.io/projected/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-kube-api-access-dbcpt\") pod \"redhat-operators-hbcct\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") " pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:54 crc kubenswrapper[4846]: I1122 09:36:54.088130 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-catalog-content\") pod \"redhat-operators-hbcct\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") " pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:54 crc kubenswrapper[4846]: I1122 09:36:54.088555 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-catalog-content\") pod \"redhat-operators-hbcct\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") " pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:54 crc kubenswrapper[4846]: I1122 09:36:54.089139 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-utilities\") pod \"redhat-operators-hbcct\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") " pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:54 crc kubenswrapper[4846]: I1122 09:36:54.121252 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dbcpt\" (UniqueName: \"kubernetes.io/projected/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-kube-api-access-dbcpt\") pod \"redhat-operators-hbcct\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") " pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:54 crc kubenswrapper[4846]: I1122 09:36:54.237177 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:36:54 crc kubenswrapper[4846]: I1122 09:36:54.768566 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hbcct"] Nov 22 09:36:55 crc kubenswrapper[4846]: I1122 09:36:55.471233 4846 generic.go:334] "Generic (PLEG): container finished" podID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerID="5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a" exitCode=0 Nov 22 09:36:55 crc kubenswrapper[4846]: I1122 09:36:55.471314 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbcct" event={"ID":"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220","Type":"ContainerDied","Data":"5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a"} Nov 22 09:36:55 crc kubenswrapper[4846]: I1122 09:36:55.471363 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbcct" event={"ID":"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220","Type":"ContainerStarted","Data":"a3b99d6fd8d746ed595b91825c1f114576f9aa11e8f5fe2927d5dbd61fd07f7c"} Nov 22 09:36:56 crc kubenswrapper[4846]: I1122 09:36:56.485274 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbcct" event={"ID":"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220","Type":"ContainerStarted","Data":"2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8"} Nov 22 09:36:57 crc kubenswrapper[4846]: I1122 09:36:57.506788 4846 generic.go:334] "Generic (PLEG): container finished" podID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerID="2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8" exitCode=0 Nov 22 09:36:57 crc kubenswrapper[4846]: I1122 09:36:57.506968 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbcct" event={"ID":"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220","Type":"ContainerDied","Data":"2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8"} Nov 22 09:36:58 crc kubenswrapper[4846]: I1122 09:36:58.526810 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbcct" event={"ID":"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220","Type":"ContainerStarted","Data":"dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce"} Nov 22 09:36:58 crc kubenswrapper[4846]: I1122 09:36:58.528828 4846 generic.go:334] "Generic (PLEG): container finished" podID="2565b5ab-c381-4a01-bc51-98d00dc7ce25" containerID="080a784befd21ab524547d4d225ce889545bd54b6f27e18639f0f998acdc443a" exitCode=0 Nov 22 09:36:58 crc kubenswrapper[4846]: I1122 09:36:58.528889 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" event={"ID":"2565b5ab-c381-4a01-bc51-98d00dc7ce25","Type":"ContainerDied","Data":"080a784befd21ab524547d4d225ce889545bd54b6f27e18639f0f998acdc443a"} Nov 22 09:36:58 crc kubenswrapper[4846]: I1122 09:36:58.553846 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hbcct" podStartSLOduration=3.086069829 
podStartE2EDuration="5.553825665s" podCreationTimestamp="2025-11-22 09:36:53 +0000 UTC" firstStartedPulling="2025-11-22 09:36:55.47552599 +0000 UTC m=+1390.411215639" lastFinishedPulling="2025-11-22 09:36:57.943281806 +0000 UTC m=+1392.878971475" observedRunningTime="2025-11-22 09:36:58.55020663 +0000 UTC m=+1393.485896279" watchObservedRunningTime="2025-11-22 09:36:58.553825665 +0000 UTC m=+1393.489515314" Nov 22 09:36:58 crc kubenswrapper[4846]: I1122 09:36:58.626793 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:36:58 crc kubenswrapper[4846]: I1122 09:36:58.626872 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:36:58 crc kubenswrapper[4846]: I1122 09:36:58.626932 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:36:58 crc kubenswrapper[4846]: I1122 09:36:58.628121 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a14897a35386470f39071e84723014ffd191c85c1c0f4368970f8ed940d4ab69"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:36:58 crc kubenswrapper[4846]: I1122 09:36:58.628203 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://a14897a35386470f39071e84723014ffd191c85c1c0f4368970f8ed940d4ab69" gracePeriod=600 Nov 22 09:36:59 crc kubenswrapper[4846]: I1122 09:36:59.545851 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="a14897a35386470f39071e84723014ffd191c85c1c0f4368970f8ed940d4ab69" exitCode=0 Nov 22 09:36:59 crc kubenswrapper[4846]: I1122 09:36:59.545912 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"a14897a35386470f39071e84723014ffd191c85c1c0f4368970f8ed940d4ab69"} Nov 22 09:36:59 crc kubenswrapper[4846]: I1122 09:36:59.546390 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b"} Nov 22 09:36:59 crc kubenswrapper[4846]: I1122 09:36:59.546416 4846 scope.go:117] "RemoveContainer" containerID="cf9936e32ada96756d2d63284f53d35f1bafde25a492c2c86fd57715fcf497eb" Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.044593 4846 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.163229 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-repo-setup-combined-ca-bundle\") pod \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") "
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.163829 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-inventory\") pod \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") "
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.163948 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-ssh-key\") pod \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") "
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.164027 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s9sv\" (UniqueName: \"kubernetes.io/projected/2565b5ab-c381-4a01-bc51-98d00dc7ce25-kube-api-access-7s9sv\") pod \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\" (UID: \"2565b5ab-c381-4a01-bc51-98d00dc7ce25\") "
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.170861 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2565b5ab-c381-4a01-bc51-98d00dc7ce25" (UID: "2565b5ab-c381-4a01-bc51-98d00dc7ce25"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.171137 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2565b5ab-c381-4a01-bc51-98d00dc7ce25-kube-api-access-7s9sv" (OuterVolumeSpecName: "kube-api-access-7s9sv") pod "2565b5ab-c381-4a01-bc51-98d00dc7ce25" (UID: "2565b5ab-c381-4a01-bc51-98d00dc7ce25"). InnerVolumeSpecName "kube-api-access-7s9sv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.195265 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2565b5ab-c381-4a01-bc51-98d00dc7ce25" (UID: "2565b5ab-c381-4a01-bc51-98d00dc7ce25"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.217011 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-inventory" (OuterVolumeSpecName: "inventory") pod "2565b5ab-c381-4a01-bc51-98d00dc7ce25" (UID: "2565b5ab-c381-4a01-bc51-98d00dc7ce25"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.266994 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.267059 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s9sv\" (UniqueName: \"kubernetes.io/projected/2565b5ab-c381-4a01-bc51-98d00dc7ce25-kube-api-access-7s9sv\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.267083 4846 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.267105 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2565b5ab-c381-4a01-bc51-98d00dc7ce25-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.568594 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn" event={"ID":"2565b5ab-c381-4a01-bc51-98d00dc7ce25","Type":"ContainerDied","Data":"7a5425384d0ce6dc92e12392e00bc38515eee7948a16e8c55ef3c073781ed7cf"}
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.568656 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.568677 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a5425384d0ce6dc92e12392e00bc38515eee7948a16e8c55ef3c073781ed7cf"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.722230 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"]
Nov 22 09:37:00 crc kubenswrapper[4846]: E1122 09:37:00.723037 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2565b5ab-c381-4a01-bc51-98d00dc7ce25" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.723068 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2565b5ab-c381-4a01-bc51-98d00dc7ce25" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.723256 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="2565b5ab-c381-4a01-bc51-98d00dc7ce25" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.723936 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.726284 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.726462 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.727000 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.727738 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.750117 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"]
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.885706 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dmvg6\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.885905 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dmvg6\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.885944 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9bg7\" (UniqueName: \"kubernetes.io/projected/da832f40-8579-415e-82c8-3e66684eb241-kube-api-access-n9bg7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dmvg6\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.988421 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dmvg6\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.988576 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dmvg6\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.988603 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9bg7\" (UniqueName: \"kubernetes.io/projected/da832f40-8579-415e-82c8-3e66684eb241-kube-api-access-n9bg7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dmvg6\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.993992 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dmvg6\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:00 crc kubenswrapper[4846]: I1122 09:37:00.995711 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dmvg6\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:01 crc kubenswrapper[4846]: I1122 09:37:01.011935 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9bg7\" (UniqueName: \"kubernetes.io/projected/da832f40-8579-415e-82c8-3e66684eb241-kube-api-access-n9bg7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-dmvg6\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:01 crc kubenswrapper[4846]: I1122 09:37:01.042398 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:02 crc kubenswrapper[4846]: I1122 09:37:02.292454 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"]
Nov 22 09:37:02 crc kubenswrapper[4846]: W1122 09:37:02.294726 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda832f40_8579_415e_82c8_3e66684eb241.slice/crio-e705ce376956728079d58c9732b3f0ab5fffa1808d8c8d211b39dbfd92144940 WatchSource:0}: Error finding container e705ce376956728079d58c9732b3f0ab5fffa1808d8c8d211b39dbfd92144940: Status 404 returned error can't find the container with id e705ce376956728079d58c9732b3f0ab5fffa1808d8c8d211b39dbfd92144940
Nov 22 09:37:02 crc kubenswrapper[4846]: I1122 09:37:02.595379 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6" event={"ID":"da832f40-8579-415e-82c8-3e66684eb241","Type":"ContainerStarted","Data":"e705ce376956728079d58c9732b3f0ab5fffa1808d8c8d211b39dbfd92144940"}
Nov 22 09:37:03 crc kubenswrapper[4846]: I1122 09:37:03.607478 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6" event={"ID":"da832f40-8579-415e-82c8-3e66684eb241","Type":"ContainerStarted","Data":"49db78bbe8c83f0a6c85d7a337f83ae09a3fb77628d581c8fe4cd2f349622239"}
Nov 22 09:37:03 crc kubenswrapper[4846]: I1122 09:37:03.634558 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6" podStartSLOduration=3.030939828 podStartE2EDuration="3.634532135s" podCreationTimestamp="2025-11-22 09:37:00 +0000 UTC" firstStartedPulling="2025-11-22 09:37:02.308651565 +0000 UTC m=+1397.244341214" lastFinishedPulling="2025-11-22 09:37:02.912243862 +0000 UTC m=+1397.847933521" observedRunningTime="2025-11-22 09:37:03.626533823 +0000 UTC m=+1398.562223472" watchObservedRunningTime="2025-11-22 09:37:03.634532135 +0000 UTC m=+1398.570221784"
Nov 22 09:37:04 crc kubenswrapper[4846]: I1122 09:37:04.237785 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hbcct"
Nov 22 09:37:04 crc kubenswrapper[4846]: I1122 09:37:04.237836 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hbcct"
Nov 22 09:37:05 crc kubenswrapper[4846]: I1122 09:37:05.322183 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hbcct" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerName="registry-server" probeResult="failure" output=<
Nov 22 09:37:05 crc kubenswrapper[4846]: timeout: failed to connect service ":50051" within 1s
Nov 22 09:37:05 crc kubenswrapper[4846]: >
Nov 22 09:37:06 crc kubenswrapper[4846]: I1122 09:37:06.649901 4846 generic.go:334] "Generic (PLEG): container finished" podID="da832f40-8579-415e-82c8-3e66684eb241" containerID="49db78bbe8c83f0a6c85d7a337f83ae09a3fb77628d581c8fe4cd2f349622239" exitCode=0
Nov 22 09:37:06 crc kubenswrapper[4846]: I1122 09:37:06.650007 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6" event={"ID":"da832f40-8579-415e-82c8-3e66684eb241","Type":"ContainerDied","Data":"49db78bbe8c83f0a6c85d7a337f83ae09a3fb77628d581c8fe4cd2f349622239"}
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.292377 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.409494 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-ssh-key\") pod \"da832f40-8579-415e-82c8-3e66684eb241\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") "
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.409550 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9bg7\" (UniqueName: \"kubernetes.io/projected/da832f40-8579-415e-82c8-3e66684eb241-kube-api-access-n9bg7\") pod \"da832f40-8579-415e-82c8-3e66684eb241\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") "
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.409667 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-inventory\") pod \"da832f40-8579-415e-82c8-3e66684eb241\" (UID: \"da832f40-8579-415e-82c8-3e66684eb241\") "
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.418190 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da832f40-8579-415e-82c8-3e66684eb241-kube-api-access-n9bg7" (OuterVolumeSpecName: "kube-api-access-n9bg7") pod "da832f40-8579-415e-82c8-3e66684eb241" (UID: "da832f40-8579-415e-82c8-3e66684eb241"). InnerVolumeSpecName "kube-api-access-n9bg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.457716 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-inventory" (OuterVolumeSpecName: "inventory") pod "da832f40-8579-415e-82c8-3e66684eb241" (UID: "da832f40-8579-415e-82c8-3e66684eb241"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.478290 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da832f40-8579-415e-82c8-3e66684eb241" (UID: "da832f40-8579-415e-82c8-3e66684eb241"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.513031 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.513105 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9bg7\" (UniqueName: \"kubernetes.io/projected/da832f40-8579-415e-82c8-3e66684eb241-kube-api-access-n9bg7\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.513128 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da832f40-8579-415e-82c8-3e66684eb241-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.683187 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6" event={"ID":"da832f40-8579-415e-82c8-3e66684eb241","Type":"ContainerDied","Data":"e705ce376956728079d58c9732b3f0ab5fffa1808d8c8d211b39dbfd92144940"}
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.683238 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e705ce376956728079d58c9732b3f0ab5fffa1808d8c8d211b39dbfd92144940"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.683324 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-dmvg6"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.787849 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"]
Nov 22 09:37:08 crc kubenswrapper[4846]: E1122 09:37:08.788387 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da832f40-8579-415e-82c8-3e66684eb241" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.788408 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="da832f40-8579-415e-82c8-3e66684eb241" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.788665 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="da832f40-8579-415e-82c8-3e66684eb241" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.789457 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.796249 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.796484 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.796719 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.797338 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.819069 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"]
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.823665 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.823781 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.823923 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4ws9\" (UniqueName: \"kubernetes.io/projected/2b50be33-843f-4f51-af42-decfb29306c4-kube-api-access-r4ws9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.823976 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.924795 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4ws9\" (UniqueName: \"kubernetes.io/projected/2b50be33-843f-4f51-af42-decfb29306c4-kube-api-access-r4ws9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.924853 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.924894 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.924947 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.930633 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.930662 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.933788 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:08 crc kubenswrapper[4846]: I1122 09:37:08.950917 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4ws9\" (UniqueName: \"kubernetes.io/projected/2b50be33-843f-4f51-af42-decfb29306c4-kube-api-access-r4ws9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:09 crc kubenswrapper[4846]: I1122 09:37:09.131221 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"
Nov 22 09:37:09 crc kubenswrapper[4846]: I1122 09:37:09.793554 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw"]
Nov 22 09:37:09 crc kubenswrapper[4846]: W1122 09:37:09.803136 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b50be33_843f_4f51_af42_decfb29306c4.slice/crio-29c19da554e33f9dddb4ac967965c07ef70a83b536716a603b67df7bae95ebaf WatchSource:0}: Error finding container 29c19da554e33f9dddb4ac967965c07ef70a83b536716a603b67df7bae95ebaf: Status 404 returned error can't find the container with id 29c19da554e33f9dddb4ac967965c07ef70a83b536716a603b67df7bae95ebaf
Nov 22 09:37:10 crc kubenswrapper[4846]: I1122 09:37:10.711071 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw" event={"ID":"2b50be33-843f-4f51-af42-decfb29306c4","Type":"ContainerStarted","Data":"29c19da554e33f9dddb4ac967965c07ef70a83b536716a603b67df7bae95ebaf"}
Nov 22 09:37:11 crc kubenswrapper[4846]: I1122 09:37:11.725473 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw" event={"ID":"2b50be33-843f-4f51-af42-decfb29306c4","Type":"ContainerStarted","Data":"873a063aa4b928da6c6f56af7630cf2b10a9aec23af1c985858078b0c8c31329"}
Nov 22 09:37:11 crc kubenswrapper[4846]: I1122 09:37:11.754559 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw" podStartSLOduration=3.049793654 podStartE2EDuration="3.754531618s" podCreationTimestamp="2025-11-22 09:37:08 +0000 UTC" firstStartedPulling="2025-11-22 09:37:09.808638974 +0000 UTC m=+1404.744328663" lastFinishedPulling="2025-11-22 09:37:10.513376938 +0000 UTC m=+1405.449066627" observedRunningTime="2025-11-22 09:37:11.752236481 +0000 UTC m=+1406.687926130" watchObservedRunningTime="2025-11-22 09:37:11.754531618 +0000 UTC m=+1406.690221307"
Nov 22 09:37:12 crc kubenswrapper[4846]: I1122 09:37:12.915425 4846 scope.go:117] "RemoveContainer" containerID="a6b00086499f1c5f5e7ccc2f8cac2b93885368d78544599d9c6f1a835fe131fc"
Nov 22 09:37:14 crc kubenswrapper[4846]: I1122 09:37:14.303681 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hbcct"
Nov 22 09:37:14 crc kubenswrapper[4846]: I1122 09:37:14.358077 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hbcct"
Nov 22 09:37:14 crc kubenswrapper[4846]: I1122 09:37:14.551244 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hbcct"]
Nov 22 09:37:15 crc kubenswrapper[4846]: I1122 09:37:15.774875 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hbcct" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerName="registry-server" containerID="cri-o://dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce" gracePeriod=2
Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.320362 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hbcct"
Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.341637 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-utilities\") pod \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") "
Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.341733 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-catalog-content\") pod \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") "
Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.341804 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbcpt\" (UniqueName: \"kubernetes.io/projected/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-kube-api-access-dbcpt\") pod \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\" (UID: \"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220\") "
Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.343458 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-utilities" (OuterVolumeSpecName: "utilities") pod "ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" (UID: "ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.352628 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-kube-api-access-dbcpt" (OuterVolumeSpecName: "kube-api-access-dbcpt") pod "ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" (UID: "ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220"). InnerVolumeSpecName "kube-api-access-dbcpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.444990 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbcpt\" (UniqueName: \"kubernetes.io/projected/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-kube-api-access-dbcpt\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.445039 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.449367 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" (UID: "ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.547234 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.791004 4846 generic.go:334] "Generic (PLEG): container finished" podID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerID="dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce" exitCode=0 Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.791099 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbcct" event={"ID":"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220","Type":"ContainerDied","Data":"dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce"} Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.791180 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbcct" event={"ID":"ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220","Type":"ContainerDied","Data":"a3b99d6fd8d746ed595b91825c1f114576f9aa11e8f5fe2927d5dbd61fd07f7c"} Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.791214 4846 scope.go:117] "RemoveContainer" containerID="dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.791201 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hbcct" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.839397 4846 scope.go:117] "RemoveContainer" containerID="2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.843394 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hbcct"] Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.865262 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hbcct"] Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.898851 4846 scope.go:117] "RemoveContainer" containerID="5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.936107 4846 scope.go:117] "RemoveContainer" containerID="dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce" Nov 22 09:37:16 crc kubenswrapper[4846]: E1122 09:37:16.936773 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce\": container with ID starting with dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce not found: ID does not exist" containerID="dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.936877 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce"} err="failed to get container status \"dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce\": rpc error: code = NotFound desc = could not find container \"dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce\": container with ID starting with dcd26a6262d27d7b1d6db390b117863c620d47f8280b79cf9bd5bf07538c30ce not found: ID does not exist" Nov 22 09:37:16 crc 
kubenswrapper[4846]: I1122 09:37:16.936938 4846 scope.go:117] "RemoveContainer" containerID="2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8" Nov 22 09:37:16 crc kubenswrapper[4846]: E1122 09:37:16.937465 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8\": container with ID starting with 2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8 not found: ID does not exist" containerID="2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.937524 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8"} err="failed to get container status \"2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8\": rpc error: code = NotFound desc = could not find container \"2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8\": container with ID starting with 2c1f675245c39fb78e2527d3fbf16c913cfa27fffba2b0885907d9f4efb0eeb8 not found: ID does not exist" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.937568 4846 scope.go:117] "RemoveContainer" containerID="5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a" Nov 22 09:37:16 crc kubenswrapper[4846]: E1122 09:37:16.938221 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a\": container with ID starting with 5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a not found: ID does not exist" containerID="5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a" Nov 22 09:37:16 crc kubenswrapper[4846]: I1122 09:37:16.938293 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a"} err="failed to get container status \"5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a\": rpc error: code = NotFound desc = could not find container \"5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a\": container with ID starting with 5895883df7c951e66bb62e7cc3de21ec5cb4147afe5691a599a305bb29f3946a not found: ID does not exist" Nov 22 09:37:18 crc kubenswrapper[4846]: I1122 09:37:18.055202 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" path="/var/lib/kubelet/pods/ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220/volumes" Nov 22 09:38:13 crc kubenswrapper[4846]: I1122 09:38:13.036769 4846 scope.go:117] "RemoveContainer" containerID="bdb46927f8a6f6c0c5bef27f83b8cbbe2bf27a096578db3a382af5a48f60fd4a" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.647299 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9w25q"] Nov 22 09:38:21 crc kubenswrapper[4846]: E1122 09:38:21.648417 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerName="extract-utilities" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.648432 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerName="extract-utilities" Nov 22 09:38:21 crc kubenswrapper[4846]: E1122 09:38:21.648458 4846 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerName="extract-content" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.648467 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerName="extract-content" Nov 22 09:38:21 crc kubenswrapper[4846]: E1122 09:38:21.648486 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerName="registry-server" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.648494 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerName="registry-server" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.648739 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea70ed5f-abf7-4f0d-a67a-03a6a7a0e220" containerName="registry-server" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.650601 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.674733 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w25q"] Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.740790 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-catalog-content\") pod \"community-operators-9w25q\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.740882 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-utilities\") pod \"community-operators-9w25q\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.740951 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx4dc\" (UniqueName: \"kubernetes.io/projected/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-kube-api-access-dx4dc\") pod \"community-operators-9w25q\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.843350 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-catalog-content\") pod \"community-operators-9w25q\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.843428 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-utilities\") pod \"community-operators-9w25q\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.843483 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx4dc\" (UniqueName: 
\"kubernetes.io/projected/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-kube-api-access-dx4dc\") pod \"community-operators-9w25q\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.844404 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-catalog-content\") pod \"community-operators-9w25q\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.844689 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-utilities\") pod \"community-operators-9w25q\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.863941 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx4dc\" (UniqueName: \"kubernetes.io/projected/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-kube-api-access-dx4dc\") pod \"community-operators-9w25q\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:21 crc kubenswrapper[4846]: I1122 09:38:21.975971 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:22 crc kubenswrapper[4846]: I1122 09:38:22.552011 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w25q"] Nov 22 09:38:22 crc kubenswrapper[4846]: I1122 09:38:22.821203 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w25q" event={"ID":"b5e26117-c9a4-4083-bcad-51d6c7dcddaf","Type":"ContainerStarted","Data":"8b0340a8fbcf23b65fd902cb5afe7385db03f080bd2080fb9e922fd82b7891b0"} Nov 22 09:38:23 crc kubenswrapper[4846]: I1122 09:38:23.837104 4846 generic.go:334] "Generic (PLEG): container finished" podID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerID="8e318212baf96adea87c75543e27940165fe5d9c9e236daeb5ae0973aa1e69c0" exitCode=0 Nov 22 09:38:23 crc kubenswrapper[4846]: I1122 09:38:23.837378 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w25q" event={"ID":"b5e26117-c9a4-4083-bcad-51d6c7dcddaf","Type":"ContainerDied","Data":"8e318212baf96adea87c75543e27940165fe5d9c9e236daeb5ae0973aa1e69c0"} Nov 22 09:38:25 crc kubenswrapper[4846]: I1122 09:38:25.870289 4846 generic.go:334] "Generic (PLEG): container finished" podID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerID="c685d32059da6af4057a6d4e5651aed348e2f8805c1f9dd6dcf218e5d6241cc1" exitCode=0 Nov 22 09:38:25 crc kubenswrapper[4846]: I1122 09:38:25.870931 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w25q" event={"ID":"b5e26117-c9a4-4083-bcad-51d6c7dcddaf","Type":"ContainerDied","Data":"c685d32059da6af4057a6d4e5651aed348e2f8805c1f9dd6dcf218e5d6241cc1"} Nov 22 09:38:26 crc kubenswrapper[4846]: I1122 09:38:26.884017 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w25q" 
event={"ID":"b5e26117-c9a4-4083-bcad-51d6c7dcddaf","Type":"ContainerStarted","Data":"646e1bdfd26b276ae8d30b34ab3c84b0cacd6ac6b8972d22f613aa3b926f54dc"} Nov 22 09:38:26 crc kubenswrapper[4846]: I1122 09:38:26.915717 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9w25q" podStartSLOduration=3.462720645 podStartE2EDuration="5.915688061s" podCreationTimestamp="2025-11-22 09:38:21 +0000 UTC" firstStartedPulling="2025-11-22 09:38:23.840224249 +0000 UTC m=+1478.775913908" lastFinishedPulling="2025-11-22 09:38:26.293191675 +0000 UTC m=+1481.228881324" observedRunningTime="2025-11-22 09:38:26.913129446 +0000 UTC m=+1481.848819135" watchObservedRunningTime="2025-11-22 09:38:26.915688061 +0000 UTC m=+1481.851377750" Nov 22 09:38:31 crc kubenswrapper[4846]: I1122 09:38:31.976733 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:31 crc kubenswrapper[4846]: I1122 09:38:31.977403 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:32 crc kubenswrapper[4846]: I1122 09:38:32.069893 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:33 crc kubenswrapper[4846]: I1122 09:38:33.055387 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:33 crc kubenswrapper[4846]: I1122 09:38:33.127105 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w25q"] Nov 22 09:38:34 crc kubenswrapper[4846]: I1122 09:38:34.986406 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9w25q" podUID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerName="registry-server" containerID="cri-o://646e1bdfd26b276ae8d30b34ab3c84b0cacd6ac6b8972d22f613aa3b926f54dc" gracePeriod=2 Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.001724 4846 generic.go:334] "Generic (PLEG): container finished" podID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerID="646e1bdfd26b276ae8d30b34ab3c84b0cacd6ac6b8972d22f613aa3b926f54dc" exitCode=0 Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.001795 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w25q" event={"ID":"b5e26117-c9a4-4083-bcad-51d6c7dcddaf","Type":"ContainerDied","Data":"646e1bdfd26b276ae8d30b34ab3c84b0cacd6ac6b8972d22f613aa3b926f54dc"} Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.614510 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.750067 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-catalog-content\") pod \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.750235 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx4dc\" (UniqueName: \"kubernetes.io/projected/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-kube-api-access-dx4dc\") pod \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.750334 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-utilities\") pod \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\" (UID: \"b5e26117-c9a4-4083-bcad-51d6c7dcddaf\") " Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.751936 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-utilities" (OuterVolumeSpecName: "utilities") pod "b5e26117-c9a4-4083-bcad-51d6c7dcddaf" (UID: "b5e26117-c9a4-4083-bcad-51d6c7dcddaf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.760774 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-kube-api-access-dx4dc" (OuterVolumeSpecName: "kube-api-access-dx4dc") pod "b5e26117-c9a4-4083-bcad-51d6c7dcddaf" (UID: "b5e26117-c9a4-4083-bcad-51d6c7dcddaf"). InnerVolumeSpecName "kube-api-access-dx4dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.807159 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5e26117-c9a4-4083-bcad-51d6c7dcddaf" (UID: "b5e26117-c9a4-4083-bcad-51d6c7dcddaf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.852754 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.852785 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx4dc\" (UniqueName: \"kubernetes.io/projected/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-kube-api-access-dx4dc\") on node \"crc\" DevicePath \"\"" Nov 22 09:38:36 crc kubenswrapper[4846]: I1122 09:38:36.852799 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5e26117-c9a4-4083-bcad-51d6c7dcddaf-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:38:37 crc kubenswrapper[4846]: I1122 09:38:37.013638 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w25q" event={"ID":"b5e26117-c9a4-4083-bcad-51d6c7dcddaf","Type":"ContainerDied","Data":"8b0340a8fbcf23b65fd902cb5afe7385db03f080bd2080fb9e922fd82b7891b0"} Nov 22 09:38:37 crc kubenswrapper[4846]: I1122 09:38:37.015221 4846 scope.go:117] "RemoveContainer" containerID="646e1bdfd26b276ae8d30b34ab3c84b0cacd6ac6b8972d22f613aa3b926f54dc" Nov 22 09:38:37 crc kubenswrapper[4846]: I1122 09:38:37.013760 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w25q" Nov 22 09:38:37 crc kubenswrapper[4846]: I1122 09:38:37.048300 4846 scope.go:117] "RemoveContainer" containerID="c685d32059da6af4057a6d4e5651aed348e2f8805c1f9dd6dcf218e5d6241cc1" Nov 22 09:38:37 crc kubenswrapper[4846]: I1122 09:38:37.055275 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w25q"] Nov 22 09:38:37 crc kubenswrapper[4846]: I1122 09:38:37.071779 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9w25q"] Nov 22 09:38:37 crc kubenswrapper[4846]: I1122 09:38:37.088845 4846 scope.go:117] "RemoveContainer" containerID="8e318212baf96adea87c75543e27940165fe5d9c9e236daeb5ae0973aa1e69c0" Nov 22 09:38:38 crc kubenswrapper[4846]: I1122 09:38:38.062142 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" path="/var/lib/kubelet/pods/b5e26117-c9a4-4083-bcad-51d6c7dcddaf/volumes" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.527307 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvsz"] Nov 22 09:38:50 crc kubenswrapper[4846]: E1122 09:38:50.528650 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerName="registry-server" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.528669 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerName="registry-server" Nov 22 09:38:50 crc kubenswrapper[4846]: E1122 09:38:50.528713 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerName="extract-utilities" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.528723 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerName="extract-utilities" Nov 22 09:38:50 crc kubenswrapper[4846]: E1122 09:38:50.528743 4846 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerName="extract-content" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.528751 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerName="extract-content" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.528995 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e26117-c9a4-4083-bcad-51d6c7dcddaf" containerName="registry-server" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.530536 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.560392 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvsz"] Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.586916 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-catalog-content\") pod \"redhat-marketplace-dvvsz\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.587240 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-utilities\") pod \"redhat-marketplace-dvvsz\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.587282 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf7vx\" (UniqueName: \"kubernetes.io/projected/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-kube-api-access-zf7vx\") pod \"redhat-marketplace-dvvsz\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.689199 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-catalog-content\") pod \"redhat-marketplace-dvvsz\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.689688 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-utilities\") pod \"redhat-marketplace-dvvsz\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.689841 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf7vx\" (UniqueName: \"kubernetes.io/projected/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-kube-api-access-zf7vx\") pod \"redhat-marketplace-dvvsz\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.689928 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-catalog-content\") pod \"redhat-marketplace-dvvsz\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.690557 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-utilities\") pod \"redhat-marketplace-dvvsz\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.721530 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf7vx\" (UniqueName: \"kubernetes.io/projected/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-kube-api-access-zf7vx\") pod \"redhat-marketplace-dvvsz\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:50 crc kubenswrapper[4846]: I1122 09:38:50.879038 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:38:51 crc kubenswrapper[4846]: I1122 09:38:51.471413 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvsz"] Nov 22 09:38:52 crc kubenswrapper[4846]: I1122 09:38:52.208112 4846 generic.go:334] "Generic (PLEG): container finished" podID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerID="85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536" exitCode=0 Nov 22 09:38:52 crc kubenswrapper[4846]: I1122 09:38:52.208272 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvsz" event={"ID":"8d984f45-3c41-4fa6-a291-90eec6f9e3d4","Type":"ContainerDied","Data":"85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536"} Nov 22 09:38:52 crc kubenswrapper[4846]: I1122 09:38:52.208625 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvsz" event={"ID":"8d984f45-3c41-4fa6-a291-90eec6f9e3d4","Type":"ContainerStarted","Data":"6aad9e97bf0acdbc18fc713470858481c96ee175686ffd5a92369bc3fbbd946e"} Nov 22 09:38:53 crc kubenswrapper[4846]: I1122 09:38:53.222453 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvsz" event={"ID":"8d984f45-3c41-4fa6-a291-90eec6f9e3d4","Type":"ContainerStarted","Data":"4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd"} Nov 22 09:38:54 crc kubenswrapper[4846]: I1122 09:38:54.236241 4846 generic.go:334] "Generic (PLEG): container finished" podID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerID="4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd" exitCode=0 Nov 22 09:38:54 crc kubenswrapper[4846]: I1122 09:38:54.236373 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvsz" event={"ID":"8d984f45-3c41-4fa6-a291-90eec6f9e3d4","Type":"ContainerDied","Data":"4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd"} Nov 22 09:38:55 crc kubenswrapper[4846]: I1122 09:38:55.252498 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvsz" event={"ID":"8d984f45-3c41-4fa6-a291-90eec6f9e3d4","Type":"ContainerStarted","Data":"f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b"} Nov 22 09:38:55 crc kubenswrapper[4846]: I1122 09:38:55.288226 4846 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dvvsz" podStartSLOduration=2.834069229 podStartE2EDuration="5.288197673s" podCreationTimestamp="2025-11-22 09:38:50 +0000 UTC" firstStartedPulling="2025-11-22 09:38:52.210837408 +0000 UTC m=+1507.146527077" lastFinishedPulling="2025-11-22 09:38:54.664965832 +0000 UTC m=+1509.600655521" observedRunningTime="2025-11-22 09:38:55.280040831 +0000 UTC m=+1510.215730480" watchObservedRunningTime="2025-11-22 09:38:55.288197673 +0000 UTC m=+1510.223887362" Nov 22 09:39:00 crc kubenswrapper[4846]: I1122 09:39:00.879706 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:39:00 crc kubenswrapper[4846]: I1122 09:39:00.880589 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:39:00 crc kubenswrapper[4846]: I1122 09:39:00.977210 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:39:01 crc kubenswrapper[4846]: I1122 09:39:01.440516 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:39:01 crc kubenswrapper[4846]: I1122 09:39:01.526117 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvsz"] Nov 22 09:39:03 crc kubenswrapper[4846]: I1122 09:39:03.409333 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dvvsz" podUID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerName="registry-server" containerID="cri-o://f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b" gracePeriod=2 Nov 22 09:39:03 crc kubenswrapper[4846]: I1122 09:39:03.939466 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.063190 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf7vx\" (UniqueName: \"kubernetes.io/projected/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-kube-api-access-zf7vx\") pod \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.063337 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-utilities\") pod \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.063403 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-catalog-content\") pod \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\" (UID: \"8d984f45-3c41-4fa6-a291-90eec6f9e3d4\") " Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.064531 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-utilities" (OuterVolumeSpecName: "utilities") pod "8d984f45-3c41-4fa6-a291-90eec6f9e3d4" (UID: "8d984f45-3c41-4fa6-a291-90eec6f9e3d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.076323 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-kube-api-access-zf7vx" (OuterVolumeSpecName: "kube-api-access-zf7vx") pod "8d984f45-3c41-4fa6-a291-90eec6f9e3d4" (UID: "8d984f45-3c41-4fa6-a291-90eec6f9e3d4"). InnerVolumeSpecName "kube-api-access-zf7vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.082805 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d984f45-3c41-4fa6-a291-90eec6f9e3d4" (UID: "8d984f45-3c41-4fa6-a291-90eec6f9e3d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.169943 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf7vx\" (UniqueName: \"kubernetes.io/projected/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-kube-api-access-zf7vx\") on node \"crc\" DevicePath \"\"" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.170000 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.170023 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d984f45-3c41-4fa6-a291-90eec6f9e3d4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.425224 4846 generic.go:334] "Generic (PLEG): container finished" podID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerID="f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b" exitCode=0 Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.425319 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvsz" event={"ID":"8d984f45-3c41-4fa6-a291-90eec6f9e3d4","Type":"ContainerDied","Data":"f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b"} Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.425381 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvvsz" event={"ID":"8d984f45-3c41-4fa6-a291-90eec6f9e3d4","Type":"ContainerDied","Data":"6aad9e97bf0acdbc18fc713470858481c96ee175686ffd5a92369bc3fbbd946e"} Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.425426 4846 scope.go:117] "RemoveContainer" containerID="f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.425797 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvvsz" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.462717 4846 scope.go:117] "RemoveContainer" containerID="4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.492290 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvsz"] Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.492593 4846 scope.go:117] "RemoveContainer" containerID="85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.509013 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvvsz"] Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.583966 4846 scope.go:117] "RemoveContainer" containerID="f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b" Nov 22 09:39:04 crc kubenswrapper[4846]: E1122 09:39:04.586607 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b\": container with ID starting with f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b not found: ID does not exist" containerID="f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.586677 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b"} err="failed to get container status \"f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b\": rpc error: code = NotFound desc = could not find container \"f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b\": container with ID starting with f3ae621ffad6285c18ad85de4c034bb93e21d32db15889a88b82c77f0dd0835b not found: ID does not exist" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.586732 4846 scope.go:117] "RemoveContainer" containerID="4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd" Nov 22 09:39:04 crc kubenswrapper[4846]: E1122 09:39:04.587753 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd\": container with ID starting with 4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd not found: ID does not exist" containerID="4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.587807 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd"} err="failed to get container status \"4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd\": rpc error: code = NotFound desc = could not find container \"4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd\": container with ID starting with 4718c1477d4583c8eb03c022744f70b641393367ea69c776d6ba333dbf42d2dd not found: ID does not exist" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.587857 4846 scope.go:117] "RemoveContainer" containerID="85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536" Nov 22 09:39:04 crc kubenswrapper[4846]: E1122 09:39:04.588980 4846 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536\": container with ID starting with 85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536 not found: ID does not exist" containerID="85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536" Nov 22 09:39:04 crc kubenswrapper[4846]: I1122 09:39:04.589022 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536"} err="failed to get container status \"85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536\": rpc error: code = NotFound desc = could not find container \"85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536\": container with ID starting with 85c0c48be68eb1c71a2444805a013c7b2708fc2133cd0b7a573dc1eeeb1c5536 not found: ID does not exist" Nov 22 09:39:06 crc kubenswrapper[4846]: I1122 09:39:06.086432 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" path="/var/lib/kubelet/pods/8d984f45-3c41-4fa6-a291-90eec6f9e3d4/volumes" Nov 22 09:39:28 crc kubenswrapper[4846]: I1122 09:39:28.626103 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:39:28 crc kubenswrapper[4846]: I1122 09:39:28.626940 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:39:58 crc kubenswrapper[4846]: I1122 09:39:58.626123 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:39:58 crc kubenswrapper[4846]: I1122 09:39:58.627099 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:40:17 crc kubenswrapper[4846]: I1122 09:40:17.383821 4846 generic.go:334] "Generic (PLEG): container finished" podID="2b50be33-843f-4f51-af42-decfb29306c4" containerID="873a063aa4b928da6c6f56af7630cf2b10a9aec23af1c985858078b0c8c31329" exitCode=0 Nov 22 09:40:17 crc kubenswrapper[4846]: I1122 09:40:17.383932 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw" event={"ID":"2b50be33-843f-4f51-af42-decfb29306c4","Type":"ContainerDied","Data":"873a063aa4b928da6c6f56af7630cf2b10a9aec23af1c985858078b0c8c31329"} Nov 22 09:40:18 crc kubenswrapper[4846]: I1122 09:40:18.896858 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.007399 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-inventory\") pod \"2b50be33-843f-4f51-af42-decfb29306c4\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.007513 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-bootstrap-combined-ca-bundle\") pod \"2b50be33-843f-4f51-af42-decfb29306c4\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.007555 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-ssh-key\") pod \"2b50be33-843f-4f51-af42-decfb29306c4\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.007777 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4ws9\" (UniqueName: \"kubernetes.io/projected/2b50be33-843f-4f51-af42-decfb29306c4-kube-api-access-r4ws9\") pod \"2b50be33-843f-4f51-af42-decfb29306c4\" (UID: \"2b50be33-843f-4f51-af42-decfb29306c4\") " Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.015286 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2b50be33-843f-4f51-af42-decfb29306c4" (UID: "2b50be33-843f-4f51-af42-decfb29306c4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.015305 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b50be33-843f-4f51-af42-decfb29306c4-kube-api-access-r4ws9" (OuterVolumeSpecName: "kube-api-access-r4ws9") pod "2b50be33-843f-4f51-af42-decfb29306c4" (UID: "2b50be33-843f-4f51-af42-decfb29306c4"). InnerVolumeSpecName "kube-api-access-r4ws9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.038652 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-inventory" (OuterVolumeSpecName: "inventory") pod "2b50be33-843f-4f51-af42-decfb29306c4" (UID: "2b50be33-843f-4f51-af42-decfb29306c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.055465 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2b50be33-843f-4f51-af42-decfb29306c4" (UID: "2b50be33-843f-4f51-af42-decfb29306c4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.110483 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4ws9\" (UniqueName: \"kubernetes.io/projected/2b50be33-843f-4f51-af42-decfb29306c4-kube-api-access-r4ws9\") on node \"crc\" DevicePath \"\"" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.110530 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.110547 4846 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.110561 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b50be33-843f-4f51-af42-decfb29306c4-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.430272 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw" event={"ID":"2b50be33-843f-4f51-af42-decfb29306c4","Type":"ContainerDied","Data":"29c19da554e33f9dddb4ac967965c07ef70a83b536716a603b67df7bae95ebaf"} Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.430328 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29c19da554e33f9dddb4ac967965c07ef70a83b536716a603b67df7bae95ebaf" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.430342 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.533838 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk"] Nov 22 09:40:19 crc kubenswrapper[4846]: E1122 09:40:19.534419 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerName="extract-utilities" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.534442 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerName="extract-utilities" Nov 22 09:40:19 crc kubenswrapper[4846]: E1122 09:40:19.534487 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerName="extract-content" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.534496 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerName="extract-content" Nov 22 09:40:19 crc kubenswrapper[4846]: E1122 09:40:19.534509 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerName="registry-server" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.534517 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerName="registry-server" Nov 22 09:40:19 crc kubenswrapper[4846]: E1122 09:40:19.534546 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b50be33-843f-4f51-af42-decfb29306c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.534556 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b50be33-843f-4f51-af42-decfb29306c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.534791 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b50be33-843f-4f51-af42-decfb29306c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.534811 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d984f45-3c41-4fa6-a291-90eec6f9e3d4" containerName="registry-server" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.535720 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.539436 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.539594 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.540514 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.542717 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.557597 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk"] Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.621296 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc5fj\" (UniqueName: \"kubernetes.io/projected/ee2ff4f5-0353-438b-850b-81b49a3d22ad-kube-api-access-dc5fj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-grpjk\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.621852 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-grpjk\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.622174 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-grpjk\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.724982 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-grpjk\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.725249 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-grpjk\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.725458 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc5fj\" (UniqueName: \"kubernetes.io/projected/ee2ff4f5-0353-438b-850b-81b49a3d22ad-kube-api-access-dc5fj\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-grpjk\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.731760 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-grpjk\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.736416 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-grpjk\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.758870 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc5fj\" (UniqueName: \"kubernetes.io/projected/ee2ff4f5-0353-438b-850b-81b49a3d22ad-kube-api-access-dc5fj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-grpjk\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:19 crc kubenswrapper[4846]: I1122 09:40:19.864615 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:40:20 crc kubenswrapper[4846]: I1122 09:40:20.525511 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk"] Nov 22 09:40:20 crc kubenswrapper[4846]: I1122 09:40:20.527454 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:40:21 crc kubenswrapper[4846]: I1122 09:40:21.477688 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" event={"ID":"ee2ff4f5-0353-438b-850b-81b49a3d22ad","Type":"ContainerStarted","Data":"dbf277bd085e0eb994f1a0703c1f31d93ee95a9b77f8e763f50b26301c28d570"} Nov 22 09:40:21 crc kubenswrapper[4846]: I1122 09:40:21.478240 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" event={"ID":"ee2ff4f5-0353-438b-850b-81b49a3d22ad","Type":"ContainerStarted","Data":"8b1b9bf65d71dcbe65b896c064cebf3d2cbd3af61570f1db5855f5d8fc5e1110"} Nov 22 09:40:21 crc kubenswrapper[4846]: I1122 09:40:21.519338 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" podStartSLOduration=1.9987842329999999 podStartE2EDuration="2.519313257s" podCreationTimestamp="2025-11-22 09:40:19 +0000 UTC" firstStartedPulling="2025-11-22 09:40:20.527069226 +0000 UTC m=+1595.462758915" lastFinishedPulling="2025-11-22 09:40:21.04759825 +0000 UTC m=+1595.983287939" observedRunningTime="2025-11-22 09:40:21.508815534 +0000 UTC m=+1596.444505213" watchObservedRunningTime="2025-11-22 09:40:21.519313257 +0000 UTC m=+1596.455002926" Nov 22 09:40:28 crc kubenswrapper[4846]: I1122 09:40:28.625943 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:40:28 crc kubenswrapper[4846]: I1122 09:40:28.626867 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:40:28 crc kubenswrapper[4846]: I1122 09:40:28.626949 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:40:28 crc kubenswrapper[4846]: I1122 09:40:28.628294 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:40:28 crc kubenswrapper[4846]: I1122 09:40:28.628525 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" gracePeriod=600 Nov 22 09:40:28 crc kubenswrapper[4846]: E1122 09:40:28.774776 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:40:29 crc kubenswrapper[4846]: I1122 09:40:29.597307 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" exitCode=0 Nov 22 09:40:29 crc kubenswrapper[4846]: I1122 09:40:29.597369 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b"} Nov 22 09:40:29 crc kubenswrapper[4846]: I1122 09:40:29.597786 4846 scope.go:117] "RemoveContainer" containerID="a14897a35386470f39071e84723014ffd191c85c1c0f4368970f8ed940d4ab69" Nov 22 09:40:29 crc kubenswrapper[4846]: I1122 09:40:29.599529 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:40:29 crc kubenswrapper[4846]: E1122 09:40:29.599930 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" 
podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.036116 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:40:40 crc kubenswrapper[4846]: E1122 09:40:40.037155 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.080387 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-dfac-account-create-jx6r6"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.091550 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-l2vdd"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.102585 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-scf94"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.109254 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-l2vdd"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.116834 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-dfac-account-create-jx6r6"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.123238 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-scf94"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.129215 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4809-account-create-tsnms"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.136256 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-c9qdj"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.144726 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-15ca-account-create-4jszs"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.153475 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4809-account-create-tsnms"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.160509 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-15ca-account-create-4jszs"] Nov 22 09:40:40 crc kubenswrapper[4846]: I1122 09:40:40.167711 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-c9qdj"] Nov 22 09:40:42 crc kubenswrapper[4846]: I1122 09:40:42.064185 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5" path="/var/lib/kubelet/pods/5d63f9ef-2da3-4b25-8a16-b7bbeba5c2d5/volumes" Nov 22 09:40:42 crc kubenswrapper[4846]: I1122 09:40:42.066884 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ebb80d-1c77-4582-a075-78376fe2c7dd" path="/var/lib/kubelet/pods/70ebb80d-1c77-4582-a075-78376fe2c7dd/volumes" Nov 22 09:40:42 crc kubenswrapper[4846]: I1122 09:40:42.068335 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7209d158-55d3-457e-a685-83d7a82fb290" path="/var/lib/kubelet/pods/7209d158-55d3-457e-a685-83d7a82fb290/volumes" Nov 22 09:40:42 crc kubenswrapper[4846]: I1122 09:40:42.069636 4846 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="d0bc7919-add5-46fe-ab1b-26b7b3e114de" path="/var/lib/kubelet/pods/d0bc7919-add5-46fe-ab1b-26b7b3e114de/volumes" Nov 22 09:40:42 crc kubenswrapper[4846]: I1122 09:40:42.073027 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3695d7a-eba8-4780-9995-e47e5989da34" path="/var/lib/kubelet/pods/e3695d7a-eba8-4780-9995-e47e5989da34/volumes" Nov 22 09:40:42 crc kubenswrapper[4846]: I1122 09:40:42.074389 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eddb06eb-36f7-48ba-acbc-b2129ca2b43d" path="/var/lib/kubelet/pods/eddb06eb-36f7-48ba-acbc-b2129ca2b43d/volumes" Nov 22 09:40:54 crc kubenswrapper[4846]: I1122 09:40:54.035993 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:40:54 crc kubenswrapper[4846]: E1122 09:40:54.037546 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.087760 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ed51-account-create-rvllg"] Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.095989 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-t2mkn"] Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.107951 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-8zbc6"] Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.116810 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mdpsq"] Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.123307 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-mdpsq"] Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.129358 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ed51-account-create-rvllg"] Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.137088 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-t2mkn"] Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.144401 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8zbc6"] Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.162181 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fa87-account-create-nn2fj"] Nov 22 09:41:05 crc kubenswrapper[4846]: I1122 09:41:05.168988 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-fa87-account-create-nn2fj"] Nov 22 09:41:06 crc kubenswrapper[4846]: I1122 09:41:06.049113 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e293562-6940-4e74-90c4-a57ba16599ef" path="/var/lib/kubelet/pods/1e293562-6940-4e74-90c4-a57ba16599ef/volumes" Nov 22 09:41:06 crc kubenswrapper[4846]: I1122 09:41:06.049672 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b2cb86a-7325-4151-9b4c-b8af3060b82a" path="/var/lib/kubelet/pods/2b2cb86a-7325-4151-9b4c-b8af3060b82a/volumes" Nov 22 09:41:06 crc kubenswrapper[4846]: I1122 09:41:06.050254 4846 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="413f1d30-1d47-47b2-a954-91b8ed0134f3" path="/var/lib/kubelet/pods/413f1d30-1d47-47b2-a954-91b8ed0134f3/volumes" Nov 22 09:41:06 crc kubenswrapper[4846]: I1122 09:41:06.050799 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792e14dc-fdaa-4ea2-a71d-6bef55b41871" path="/var/lib/kubelet/pods/792e14dc-fdaa-4ea2-a71d-6bef55b41871/volumes" Nov 22 09:41:06 crc kubenswrapper[4846]: I1122 09:41:06.051931 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba0e4040-2d4f-423f-8540-368a4c49bd74" path="/var/lib/kubelet/pods/ba0e4040-2d4f-423f-8540-368a4c49bd74/volumes" Nov 22 09:41:06 crc kubenswrapper[4846]: I1122 09:41:06.052531 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0e57-account-create-sbb6n"] Nov 22 09:41:06 crc kubenswrapper[4846]: I1122 09:41:06.057646 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0e57-account-create-sbb6n"] Nov 22 09:41:08 crc kubenswrapper[4846]: I1122 09:41:08.055884 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec8aed9-00f5-4a29-af65-f87ab06bdda5" path="/var/lib/kubelet/pods/9ec8aed9-00f5-4a29-af65-f87ab06bdda5/volumes" Nov 22 09:41:09 crc kubenswrapper[4846]: I1122 09:41:09.037337 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:41:09 crc kubenswrapper[4846]: E1122 09:41:09.038318 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:41:11 crc kubenswrapper[4846]: I1122 09:41:11.047340 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-sdgn9"] Nov 22 09:41:11 crc kubenswrapper[4846]: I1122 09:41:11.063013 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-sdgn9"] Nov 22 09:41:12 crc kubenswrapper[4846]: I1122 09:41:12.054232 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf68f3c4-7d31-4738-8a2f-97e24d184a29" path="/var/lib/kubelet/pods/bf68f3c4-7d31-4738-8a2f-97e24d184a29/volumes" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.251555 4846 scope.go:117] "RemoveContainer" containerID="4b5f0b985023954e84079c6d18a6d0b439d3df08561d980739456063aeaf0b62" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.295014 4846 scope.go:117] "RemoveContainer" containerID="b8ce0bf5fdf5fb7df3baaeb39a7a5691b8bae7beaf3f7aba54c339d0dd4ac80f" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.332661 4846 scope.go:117] "RemoveContainer" containerID="86d3518959e308545599333a51e77cd235a28178a8e9f68826241bf425a0a446" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.371655 4846 scope.go:117] "RemoveContainer" containerID="7d6a1861e0eddf446083dd7189b30b4479b51e25f0f5860f3ebf71781945b3f7" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.407102 4846 scope.go:117] "RemoveContainer" containerID="f39cac436f676610a98305350fcf4bccf58087cdd93c19debe4295b147e4fa20" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.451633 4846 scope.go:117] "RemoveContainer" containerID="32497e82c613baa575e2095e88dfd21377d9256d60c37c294b6386179b552051" Nov 22 
09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.493295 4846 scope.go:117] "RemoveContainer" containerID="f249fe84a9c2f30e2d5ecec9b5b3cbeb411fba19b8045fe57eec7f3aaad3dcbd" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.520612 4846 scope.go:117] "RemoveContainer" containerID="ed4a0c24553ebc438a1494550f18b15903db25b9b2741f7c1a482648530cc12a" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.546984 4846 scope.go:117] "RemoveContainer" containerID="5b1dbd0c28487979c863dcb19ba83b7ddd62b6a7b5fea1a2c8d8ad769274c537" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.571036 4846 scope.go:117] "RemoveContainer" containerID="9364d643445c4a1c9cb1400f2ee8b1d0aeff33c8e3b586cbcb2bd398ced15413" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.592925 4846 scope.go:117] "RemoveContainer" containerID="36802b5be6a63763b03548bdc20aa224351a9117d7dd2fb0392b7d5d0b1c2a64" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.621718 4846 scope.go:117] "RemoveContainer" containerID="61733e54616f04475ee736b0b982cbe49c948a4518fb2dd424d1dae3f6529ba3" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.663878 4846 scope.go:117] "RemoveContainer" containerID="6a99b492d417f7aa9dbf48002719dc4cbc9c84d5be47aee1b141fc93d6319fbc" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.695014 4846 scope.go:117] "RemoveContainer" containerID="97adf0ffd581501bf454dbee296281151870069ea39674c160a4af6904ce4ccf" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.734533 4846 scope.go:117] "RemoveContainer" containerID="9a32b28148a5a55c01240a63a00a3a5bc2ac21f11a4d8adf94ebf5b5bd17d091" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.761932 4846 scope.go:117] "RemoveContainer" containerID="04533a3855c349d95bfaf88aa0774798a6e464da7aaf4f31703417377346ff0a" Nov 22 09:41:13 crc kubenswrapper[4846]: I1122 09:41:13.782567 4846 scope.go:117] "RemoveContainer" containerID="92a9703249142e8c1f9cda32fa8b94bbe087b4af2bca45f48d837c10559628ba" Nov 22 09:41:15 crc kubenswrapper[4846]: I1122 09:41:15.041365 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4nttj"] Nov 22 09:41:15 crc kubenswrapper[4846]: I1122 09:41:15.058330 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4nttj"] Nov 22 09:41:16 crc kubenswrapper[4846]: I1122 09:41:16.054847 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175f1421-f4ac-4bc9-b3f6-fa5860f556b4" path="/var/lib/kubelet/pods/175f1421-f4ac-4bc9-b3f6-fa5860f556b4/volumes" Nov 22 09:41:24 crc kubenswrapper[4846]: I1122 09:41:24.036287 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:41:24 crc kubenswrapper[4846]: E1122 09:41:24.037365 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:41:35 crc kubenswrapper[4846]: I1122 09:41:35.591218 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:41:35 crc kubenswrapper[4846]: E1122 09:41:35.592478 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:41:48 crc kubenswrapper[4846]: I1122 09:41:48.034839 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:41:48 crc kubenswrapper[4846]: E1122 09:41:48.035621 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:41:53 crc kubenswrapper[4846]: I1122 09:41:53.832186 4846 generic.go:334] "Generic (PLEG): container finished" podID="ee2ff4f5-0353-438b-850b-81b49a3d22ad" containerID="dbf277bd085e0eb994f1a0703c1f31d93ee95a9b77f8e763f50b26301c28d570" exitCode=0 Nov 22 09:41:53 crc kubenswrapper[4846]: I1122 09:41:53.832299 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" event={"ID":"ee2ff4f5-0353-438b-850b-81b49a3d22ad","Type":"ContainerDied","Data":"dbf277bd085e0eb994f1a0703c1f31d93ee95a9b77f8e763f50b26301c28d570"} Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.445270 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.527386 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-inventory\") pod \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.527488 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc5fj\" (UniqueName: \"kubernetes.io/projected/ee2ff4f5-0353-438b-850b-81b49a3d22ad-kube-api-access-dc5fj\") pod \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.527536 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-ssh-key\") pod \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\" (UID: \"ee2ff4f5-0353-438b-850b-81b49a3d22ad\") " Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.535957 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2ff4f5-0353-438b-850b-81b49a3d22ad-kube-api-access-dc5fj" (OuterVolumeSpecName: "kube-api-access-dc5fj") pod "ee2ff4f5-0353-438b-850b-81b49a3d22ad" (UID: "ee2ff4f5-0353-438b-850b-81b49a3d22ad"). InnerVolumeSpecName "kube-api-access-dc5fj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.566087 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ee2ff4f5-0353-438b-850b-81b49a3d22ad" (UID: "ee2ff4f5-0353-438b-850b-81b49a3d22ad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.579071 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-inventory" (OuterVolumeSpecName: "inventory") pod "ee2ff4f5-0353-438b-850b-81b49a3d22ad" (UID: "ee2ff4f5-0353-438b-850b-81b49a3d22ad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.630377 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.630412 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc5fj\" (UniqueName: \"kubernetes.io/projected/ee2ff4f5-0353-438b-850b-81b49a3d22ad-kube-api-access-dc5fj\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.630424 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee2ff4f5-0353-438b-850b-81b49a3d22ad-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.863788 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.863651 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-grpjk" event={"ID":"ee2ff4f5-0353-438b-850b-81b49a3d22ad","Type":"ContainerDied","Data":"8b1b9bf65d71dcbe65b896c064cebf3d2cbd3af61570f1db5855f5d8fc5e1110"} Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.865138 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1b9bf65d71dcbe65b896c064cebf3d2cbd3af61570f1db5855f5d8fc5e1110" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.947956 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp"] Nov 22 09:41:55 crc kubenswrapper[4846]: E1122 09:41:55.948403 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2ff4f5-0353-438b-850b-81b49a3d22ad" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.948418 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2ff4f5-0353-438b-850b-81b49a3d22ad" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.948673 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2ff4f5-0353-438b-850b-81b49a3d22ad" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.949517 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.951727 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.951862 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.955132 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.955175 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:41:55 crc kubenswrapper[4846]: I1122 09:41:55.961787 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp"] Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.040370 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-crlnp\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.040437 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-crlnp\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.040471 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbfr\" (UniqueName: \"kubernetes.io/projected/4e8248db-f0c2-40ad-a534-e3076fae3466-kube-api-access-cqbfr\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-crlnp\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.144524 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-crlnp\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.144599 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-crlnp\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.144639 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbfr\" (UniqueName: \"kubernetes.io/projected/4e8248db-f0c2-40ad-a534-e3076fae3466-kube-api-access-cqbfr\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-crlnp\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.150663 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-crlnp\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.151831 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-crlnp\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.169038 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqbfr\" (UniqueName: \"kubernetes.io/projected/4e8248db-f0c2-40ad-a534-e3076fae3466-kube-api-access-cqbfr\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-crlnp\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.270003 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.723383 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp"] Nov 22 09:41:56 crc kubenswrapper[4846]: I1122 09:41:56.876504 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" event={"ID":"4e8248db-f0c2-40ad-a534-e3076fae3466","Type":"ContainerStarted","Data":"9963a9f540deae3223f82381cbedec4ee1de3a5ab4739b568e5b2c0c22793cfc"} Nov 22 09:41:57 crc kubenswrapper[4846]: I1122 09:41:57.889139 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" event={"ID":"4e8248db-f0c2-40ad-a534-e3076fae3466","Type":"ContainerStarted","Data":"5eee8946aa6deca3672aebb0e8d9ea41dae26ce6acb8a1fb34546f3055ecce5a"} Nov 22 09:41:57 crc kubenswrapper[4846]: I1122 09:41:57.911705 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" podStartSLOduration=2.225267949 podStartE2EDuration="2.911689081s" podCreationTimestamp="2025-11-22 09:41:55 +0000 UTC" firstStartedPulling="2025-11-22 09:41:56.728777799 +0000 UTC m=+1691.664467468" lastFinishedPulling="2025-11-22 09:41:57.415198951 +0000 UTC m=+1692.350888600" observedRunningTime="2025-11-22 09:41:57.910090819 +0000 UTC m=+1692.845780468" watchObservedRunningTime="2025-11-22 09:41:57.911689081 +0000 UTC m=+1692.847378730" Nov 22 09:41:59 crc kubenswrapper[4846]: I1122 09:41:59.036981 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:41:59 crc kubenswrapper[4846]: E1122 09:41:59.038108 4846 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:42:10 crc kubenswrapper[4846]: I1122 09:42:10.036296 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:42:10 crc kubenswrapper[4846]: E1122 09:42:10.037895 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:42:14 crc kubenswrapper[4846]: I1122 09:42:14.169242 4846 scope.go:117] "RemoveContainer" containerID="734062ff9482717a76d935520e9bc3ce7897edcfa111add693f9048416d5c458" Nov 22 09:42:24 crc kubenswrapper[4846]: I1122 09:42:24.057748 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wjc4l"] Nov 22 09:42:24 crc kubenswrapper[4846]: I1122 09:42:24.070806 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wjc4l"] Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.042712 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:42:25 crc kubenswrapper[4846]: E1122 09:42:25.043720 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.049607 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xh58j"] Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.064318 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xh58j"] Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.757130 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zs4mp"] Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.759634 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.778263 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95r4p\" (UniqueName: \"kubernetes.io/projected/7657f27c-78dc-44d0-b809-f3e574f561a1-kube-api-access-95r4p\") pod \"certified-operators-zs4mp\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.778344 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-utilities\") pod \"certified-operators-zs4mp\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.778532 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-catalog-content\") pod \"certified-operators-zs4mp\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.782631 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zs4mp"] Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.880947 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95r4p\" (UniqueName: \"kubernetes.io/projected/7657f27c-78dc-44d0-b809-f3e574f561a1-kube-api-access-95r4p\") pod \"certified-operators-zs4mp\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.881011 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-utilities\") pod \"certified-operators-zs4mp\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.881062 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-catalog-content\") pod \"certified-operators-zs4mp\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.881703 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-catalog-content\") pod \"certified-operators-zs4mp\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.882337 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-utilities\") pod \"certified-operators-zs4mp\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:25 crc kubenswrapper[4846]: I1122 09:42:25.904298 4846 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-95r4p\" (UniqueName: \"kubernetes.io/projected/7657f27c-78dc-44d0-b809-f3e574f561a1-kube-api-access-95r4p\") pod \"certified-operators-zs4mp\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:26 crc kubenswrapper[4846]: I1122 09:42:26.054982 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c8a1c0-2155-4d68-971a-e68aff9e5133" path="/var/lib/kubelet/pods/49c8a1c0-2155-4d68-971a-e68aff9e5133/volumes" Nov 22 09:42:26 crc kubenswrapper[4846]: I1122 09:42:26.056144 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc88a2ff-257e-4a2b-81b5-e35f78e77a1d" path="/var/lib/kubelet/pods/cc88a2ff-257e-4a2b-81b5-e35f78e77a1d/volumes" Nov 22 09:42:26 crc kubenswrapper[4846]: I1122 09:42:26.099946 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:26 crc kubenswrapper[4846]: I1122 09:42:26.628292 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zs4mp"] Nov 22 09:42:27 crc kubenswrapper[4846]: I1122 09:42:27.248515 4846 generic.go:334] "Generic (PLEG): container finished" podID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerID="67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543" exitCode=0 Nov 22 09:42:27 crc kubenswrapper[4846]: I1122 09:42:27.248597 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zs4mp" event={"ID":"7657f27c-78dc-44d0-b809-f3e574f561a1","Type":"ContainerDied","Data":"67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543"} Nov 22 09:42:27 crc kubenswrapper[4846]: I1122 09:42:27.248646 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zs4mp" event={"ID":"7657f27c-78dc-44d0-b809-f3e574f561a1","Type":"ContainerStarted","Data":"ce63946788fbc70eb94f3e44853ad1bf23af8420b603a4e949e58dd0b97d937a"} Nov 22 09:42:28 crc kubenswrapper[4846]: I1122 09:42:28.271813 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zs4mp" event={"ID":"7657f27c-78dc-44d0-b809-f3e574f561a1","Type":"ContainerStarted","Data":"b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e"} Nov 22 09:42:29 crc kubenswrapper[4846]: I1122 09:42:29.287599 4846 generic.go:334] "Generic (PLEG): container finished" podID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerID="b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e" exitCode=0 Nov 22 09:42:29 crc kubenswrapper[4846]: I1122 09:42:29.287755 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zs4mp" event={"ID":"7657f27c-78dc-44d0-b809-f3e574f561a1","Type":"ContainerDied","Data":"b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e"} Nov 22 09:42:30 crc kubenswrapper[4846]: I1122 09:42:30.303221 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zs4mp" event={"ID":"7657f27c-78dc-44d0-b809-f3e574f561a1","Type":"ContainerStarted","Data":"061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd"} Nov 22 09:42:30 crc kubenswrapper[4846]: I1122 09:42:30.364226 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zs4mp" podStartSLOduration=2.807660987 podStartE2EDuration="5.364203802s" 
podCreationTimestamp="2025-11-22 09:42:25 +0000 UTC" firstStartedPulling="2025-11-22 09:42:27.251167809 +0000 UTC m=+1722.186857468" lastFinishedPulling="2025-11-22 09:42:29.807710624 +0000 UTC m=+1724.743400283" observedRunningTime="2025-11-22 09:42:30.362314063 +0000 UTC m=+1725.298003752" watchObservedRunningTime="2025-11-22 09:42:30.364203802 +0000 UTC m=+1725.299893461" Nov 22 09:42:31 crc kubenswrapper[4846]: I1122 09:42:31.030273 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mtz29"] Nov 22 09:42:31 crc kubenswrapper[4846]: I1122 09:42:31.041483 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mtz29"] Nov 22 09:42:32 crc kubenswrapper[4846]: I1122 09:42:32.050361 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb094f7c-1527-476d-bf4a-d54a022320d0" path="/var/lib/kubelet/pods/bb094f7c-1527-476d-bf4a-d54a022320d0/volumes" Nov 22 09:42:36 crc kubenswrapper[4846]: I1122 09:42:36.104395 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:36 crc kubenswrapper[4846]: I1122 09:42:36.105113 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:36 crc kubenswrapper[4846]: I1122 09:42:36.164398 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:36 crc kubenswrapper[4846]: I1122 09:42:36.783295 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xgp99"] Nov 22 09:42:36 crc kubenswrapper[4846]: I1122 09:42:36.794523 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xgp99"] Nov 22 09:42:36 crc kubenswrapper[4846]: I1122 09:42:36.833627 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:36 crc kubenswrapper[4846]: I1122 09:42:36.960006 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zs4mp"] Nov 22 09:42:37 crc kubenswrapper[4846]: I1122 09:42:37.041998 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9m5n9"] Nov 22 09:42:37 crc kubenswrapper[4846]: I1122 09:42:37.057822 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9m5n9"] Nov 22 09:42:38 crc kubenswrapper[4846]: I1122 09:42:38.057154 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="083da0b8-38d6-4eab-b211-8389df97a0a8" path="/var/lib/kubelet/pods/083da0b8-38d6-4eab-b211-8389df97a0a8/volumes" Nov 22 09:42:38 crc kubenswrapper[4846]: I1122 09:42:38.058560 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="584aeb0f-b1a9-4a6e-b129-b21593065b18" path="/var/lib/kubelet/pods/584aeb0f-b1a9-4a6e-b129-b21593065b18/volumes" Nov 22 09:42:38 crc kubenswrapper[4846]: I1122 09:42:38.784257 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zs4mp" podUID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerName="registry-server" containerID="cri-o://061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd" gracePeriod=2 Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.742365 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.797265 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zs4mp" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.797351 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zs4mp" event={"ID":"7657f27c-78dc-44d0-b809-f3e574f561a1","Type":"ContainerDied","Data":"061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd"} Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.797384 4846 generic.go:334] "Generic (PLEG): container finished" podID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerID="061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd" exitCode=0 Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.797420 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zs4mp" event={"ID":"7657f27c-78dc-44d0-b809-f3e574f561a1","Type":"ContainerDied","Data":"ce63946788fbc70eb94f3e44853ad1bf23af8420b603a4e949e58dd0b97d937a"} Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.797444 4846 scope.go:117] "RemoveContainer" containerID="061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.821791 4846 scope.go:117] "RemoveContainer" containerID="b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.845600 4846 scope.go:117] "RemoveContainer" containerID="67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.885842 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-catalog-content\") pod \"7657f27c-78dc-44d0-b809-f3e574f561a1\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.886023 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-utilities\") pod \"7657f27c-78dc-44d0-b809-f3e574f561a1\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.886267 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95r4p\" (UniqueName: \"kubernetes.io/projected/7657f27c-78dc-44d0-b809-f3e574f561a1-kube-api-access-95r4p\") pod \"7657f27c-78dc-44d0-b809-f3e574f561a1\" (UID: \"7657f27c-78dc-44d0-b809-f3e574f561a1\") " Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.887801 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-utilities" (OuterVolumeSpecName: "utilities") pod "7657f27c-78dc-44d0-b809-f3e574f561a1" (UID: "7657f27c-78dc-44d0-b809-f3e574f561a1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.894623 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7657f27c-78dc-44d0-b809-f3e574f561a1-kube-api-access-95r4p" (OuterVolumeSpecName: "kube-api-access-95r4p") pod "7657f27c-78dc-44d0-b809-f3e574f561a1" (UID: "7657f27c-78dc-44d0-b809-f3e574f561a1"). InnerVolumeSpecName "kube-api-access-95r4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.913662 4846 scope.go:117] "RemoveContainer" containerID="061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd" Nov 22 09:42:39 crc kubenswrapper[4846]: E1122 09:42:39.914294 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd\": container with ID starting with 061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd not found: ID does not exist" containerID="061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.914344 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd"} err="failed to get container status \"061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd\": rpc error: code = NotFound desc = could not find container \"061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd\": container with ID starting with 061214583ac6727dd69171a19b4e723dda62b374846087cbff50e1493ec73bcd not found: ID does not exist" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.914380 4846 scope.go:117] "RemoveContainer" containerID="b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e" Nov 22 09:42:39 crc kubenswrapper[4846]: E1122 09:42:39.914737 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e\": container with ID starting with b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e not found: ID does not exist" containerID="b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.914802 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e"} err="failed to get container status \"b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e\": rpc error: code = NotFound desc = could not find container \"b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e\": container with ID starting with b217c8c50410128cabefc4de03e2e07b853a78095a364812d1151b8a96a1fd9e not found: ID does not exist" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.914849 4846 scope.go:117] "RemoveContainer" containerID="67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543" Nov 22 09:42:39 crc kubenswrapper[4846]: E1122 09:42:39.915213 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543\": container with ID starting with 67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543 not found: ID does not 
exist" containerID="67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.915258 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543"} err="failed to get container status \"67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543\": rpc error: code = NotFound desc = could not find container \"67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543\": container with ID starting with 67f230fa13217d7574ec2ed2530722da2c1ee0cd00bdbb26a6936d2c94984543 not found: ID does not exist" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.976841 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7657f27c-78dc-44d0-b809-f3e574f561a1" (UID: "7657f27c-78dc-44d0-b809-f3e574f561a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.989446 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95r4p\" (UniqueName: \"kubernetes.io/projected/7657f27c-78dc-44d0-b809-f3e574f561a1-kube-api-access-95r4p\") on node \"crc\" DevicePath \"\"" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.989476 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:42:39 crc kubenswrapper[4846]: I1122 09:42:39.989488 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7657f27c-78dc-44d0-b809-f3e574f561a1-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:42:40 crc kubenswrapper[4846]: I1122 09:42:40.036248 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:42:40 crc kubenswrapper[4846]: E1122 09:42:40.036465 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:42:40 crc kubenswrapper[4846]: I1122 09:42:40.150195 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zs4mp"] Nov 22 09:42:40 crc kubenswrapper[4846]: I1122 09:42:40.169637 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zs4mp"] Nov 22 09:42:40 crc kubenswrapper[4846]: E1122 09:42:40.254800 4846 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7657f27c_78dc_44d0_b809_f3e574f561a1.slice/crio-ce63946788fbc70eb94f3e44853ad1bf23af8420b603a4e949e58dd0b97d937a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7657f27c_78dc_44d0_b809_f3e574f561a1.slice\": RecentStats: unable to find data in memory cache]" Nov 22 09:42:42 crc 
kubenswrapper[4846]: I1122 09:42:42.054172 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7657f27c-78dc-44d0-b809-f3e574f561a1" path="/var/lib/kubelet/pods/7657f27c-78dc-44d0-b809-f3e574f561a1/volumes" Nov 22 09:42:53 crc kubenswrapper[4846]: I1122 09:42:53.036661 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:42:53 crc kubenswrapper[4846]: E1122 09:42:53.038116 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:43:07 crc kubenswrapper[4846]: I1122 09:43:07.036184 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:43:07 crc kubenswrapper[4846]: E1122 09:43:07.037549 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:43:14 crc kubenswrapper[4846]: I1122 09:43:14.272082 4846 scope.go:117] "RemoveContainer" containerID="3750c0f79fd15b824c4a981dda2963917b77fa7732dee70a171168e51b897084" Nov 22 09:43:14 crc kubenswrapper[4846]: I1122 09:43:14.344230 4846 scope.go:117] "RemoveContainer" containerID="ed3a9f30db2eafb70efa74708796a5be2d4b8ad98a9cadf4f36e780439416ac2" Nov 22 09:43:14 crc kubenswrapper[4846]: I1122 09:43:14.403794 4846 scope.go:117] "RemoveContainer" containerID="5672aca4aadda66f02b97cdd31f0cb5bde14c5316521dcf7ace45e695c6e8ab7" Nov 22 09:43:14 crc kubenswrapper[4846]: I1122 09:43:14.440738 4846 scope.go:117] "RemoveContainer" containerID="4f85422a8125cdaf852b99a918947ac6379240d75c79121f17a573a8bb7927ce" Nov 22 09:43:14 crc kubenswrapper[4846]: I1122 09:43:14.494956 4846 scope.go:117] "RemoveContainer" containerID="0786d84e5c9d01386fe2c06bfab25d155d8a0d396f061bc48524ba5448031ebd" Nov 22 09:43:18 crc kubenswrapper[4846]: I1122 09:43:18.036910 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:43:18 crc kubenswrapper[4846]: E1122 09:43:18.038297 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:43:29 crc kubenswrapper[4846]: I1122 09:43:29.035811 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:43:29 crc kubenswrapper[4846]: E1122 09:43:29.037169 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.080286 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-t4dm2"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.091397 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a996-account-create-rxphj"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.102104 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zkzjs"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.111487 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tgvtl"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.119684 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5847-account-create-qm7bq"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.126138 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5dae-account-create-zzbjp"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.131821 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5847-account-create-qm7bq"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.138086 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-t4dm2"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.144851 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5dae-account-create-zzbjp"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.152416 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tgvtl"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.158987 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a996-account-create-rxphj"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.165111 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zkzjs"] Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.481545 4846 generic.go:334] "Generic (PLEG): container finished" podID="4e8248db-f0c2-40ad-a534-e3076fae3466" containerID="5eee8946aa6deca3672aebb0e8d9ea41dae26ce6acb8a1fb34546f3055ecce5a" exitCode=0 Nov 22 09:43:30 crc kubenswrapper[4846]: I1122 09:43:30.481636 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" event={"ID":"4e8248db-f0c2-40ad-a534-e3076fae3466","Type":"ContainerDied","Data":"5eee8946aa6deca3672aebb0e8d9ea41dae26ce6acb8a1fb34546f3055ecce5a"} Nov 22 09:43:31 crc kubenswrapper[4846]: I1122 09:43:31.983482 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:43:31 crc kubenswrapper[4846]: I1122 09:43:31.990393 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-inventory\") pod \"4e8248db-f0c2-40ad-a534-e3076fae3466\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " Nov 22 09:43:31 crc kubenswrapper[4846]: I1122 09:43:31.990503 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqbfr\" (UniqueName: \"kubernetes.io/projected/4e8248db-f0c2-40ad-a534-e3076fae3466-kube-api-access-cqbfr\") pod \"4e8248db-f0c2-40ad-a534-e3076fae3466\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " Nov 22 09:43:31 crc kubenswrapper[4846]: I1122 09:43:31.990648 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-ssh-key\") pod \"4e8248db-f0c2-40ad-a534-e3076fae3466\" (UID: \"4e8248db-f0c2-40ad-a534-e3076fae3466\") " Nov 22 09:43:31 crc kubenswrapper[4846]: I1122 09:43:31.999176 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e8248db-f0c2-40ad-a534-e3076fae3466-kube-api-access-cqbfr" (OuterVolumeSpecName: "kube-api-access-cqbfr") pod "4e8248db-f0c2-40ad-a534-e3076fae3466" (UID: "4e8248db-f0c2-40ad-a534-e3076fae3466"). InnerVolumeSpecName "kube-api-access-cqbfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.031782 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-inventory" (OuterVolumeSpecName: "inventory") pod "4e8248db-f0c2-40ad-a534-e3076fae3466" (UID: "4e8248db-f0c2-40ad-a534-e3076fae3466"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.047738 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="150b9c82-e37f-41c2-a4ee-1578b73f9826" path="/var/lib/kubelet/pods/150b9c82-e37f-41c2-a4ee-1578b73f9826/volumes" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.048251 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e8248db-f0c2-40ad-a534-e3076fae3466" (UID: "4e8248db-f0c2-40ad-a534-e3076fae3466"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.048690 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3ec915-c24b-4f67-8184-d21fd9a91e32" path="/var/lib/kubelet/pods/3e3ec915-c24b-4f67-8184-d21fd9a91e32/volumes" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.049391 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="483e6e5b-838d-4fc3-ae1a-82ac6ba13439" path="/var/lib/kubelet/pods/483e6e5b-838d-4fc3-ae1a-82ac6ba13439/volumes" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.050015 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7b4db2e-84b4-474a-90d0-0b9ee78e122f" path="/var/lib/kubelet/pods/a7b4db2e-84b4-474a-90d0-0b9ee78e122f/volumes" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.051113 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be978622-0c3f-41d3-b518-bbfcc1254b15" path="/var/lib/kubelet/pods/be978622-0c3f-41d3-b518-bbfcc1254b15/volumes" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.051683 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a48516-03fd-4e58-9f00-588f82223270" path="/var/lib/kubelet/pods/c8a48516-03fd-4e58-9f00-588f82223270/volumes" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.092978 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.093014 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqbfr\" (UniqueName: \"kubernetes.io/projected/4e8248db-f0c2-40ad-a534-e3076fae3466-kube-api-access-cqbfr\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.093025 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e8248db-f0c2-40ad-a534-e3076fae3466-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.504867 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" event={"ID":"4e8248db-f0c2-40ad-a534-e3076fae3466","Type":"ContainerDied","Data":"9963a9f540deae3223f82381cbedec4ee1de3a5ab4739b568e5b2c0c22793cfc"} Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.504931 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9963a9f540deae3223f82381cbedec4ee1de3a5ab4739b568e5b2c0c22793cfc" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.504966 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-crlnp" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.636974 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz"] Nov 22 09:43:32 crc kubenswrapper[4846]: E1122 09:43:32.637439 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e8248db-f0c2-40ad-a534-e3076fae3466" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.637461 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8248db-f0c2-40ad-a534-e3076fae3466" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 09:43:32 crc kubenswrapper[4846]: E1122 09:43:32.637486 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerName="registry-server" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.637495 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerName="registry-server" Nov 22 09:43:32 crc kubenswrapper[4846]: E1122 09:43:32.637521 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerName="extract-utilities" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.637529 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerName="extract-utilities" Nov 22 09:43:32 crc kubenswrapper[4846]: E1122 09:43:32.637548 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerName="extract-content" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.637557 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerName="extract-content" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.637766 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e8248db-f0c2-40ad-a534-e3076fae3466" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.637786 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="7657f27c-78dc-44d0-b809-f3e574f561a1" containerName="registry-server" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.638546 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.640530 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.640734 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.641135 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.643598 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.662453 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz"] Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.805008 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.805132 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rmc\" (UniqueName: \"kubernetes.io/projected/cfeec82e-6d58-4819-8715-7d0febbe480c-kube-api-access-p6rmc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.805261 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.907771 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.907849 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rmc\" (UniqueName: \"kubernetes.io/projected/cfeec82e-6d58-4819-8715-7d0febbe480c-kube-api-access-p6rmc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.907945 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.915652 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.918398 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.929430 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rmc\" (UniqueName: \"kubernetes.io/projected/cfeec82e-6d58-4819-8715-7d0febbe480c-kube-api-access-p6rmc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:32 crc kubenswrapper[4846]: I1122 09:43:32.961942 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:33 crc kubenswrapper[4846]: I1122 09:43:33.591271 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz"] Nov 22 09:43:34 crc kubenswrapper[4846]: I1122 09:43:34.537141 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" event={"ID":"cfeec82e-6d58-4819-8715-7d0febbe480c","Type":"ContainerStarted","Data":"a96cafd339d9617809a00ce92419b6257f06997578008093a88ab4ff7757ea71"} Nov 22 09:43:34 crc kubenswrapper[4846]: I1122 09:43:34.537428 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" event={"ID":"cfeec82e-6d58-4819-8715-7d0febbe480c","Type":"ContainerStarted","Data":"b30636a94dddf477db4c2beaae60b654df750917e3f72216387153b5d893318a"} Nov 22 09:43:34 crc kubenswrapper[4846]: I1122 09:43:34.565388 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" podStartSLOduration=2.083337388 podStartE2EDuration="2.565368167s" podCreationTimestamp="2025-11-22 09:43:32 +0000 UTC" firstStartedPulling="2025-11-22 09:43:33.605327735 +0000 UTC m=+1788.541017394" lastFinishedPulling="2025-11-22 09:43:34.087358524 +0000 UTC m=+1789.023048173" observedRunningTime="2025-11-22 09:43:34.559334213 +0000 UTC m=+1789.495023872" watchObservedRunningTime="2025-11-22 09:43:34.565368167 +0000 UTC m=+1789.501057826" Nov 22 09:43:39 crc kubenswrapper[4846]: I1122 09:43:39.603783 4846 generic.go:334] "Generic (PLEG): container finished" podID="cfeec82e-6d58-4819-8715-7d0febbe480c" containerID="a96cafd339d9617809a00ce92419b6257f06997578008093a88ab4ff7757ea71" exitCode=0 Nov 22 09:43:39 crc kubenswrapper[4846]: I1122 
09:43:39.603893 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" event={"ID":"cfeec82e-6d58-4819-8715-7d0febbe480c","Type":"ContainerDied","Data":"a96cafd339d9617809a00ce92419b6257f06997578008093a88ab4ff7757ea71"} Nov 22 09:43:40 crc kubenswrapper[4846]: I1122 09:43:40.036229 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:43:40 crc kubenswrapper[4846]: E1122 09:43:40.036947 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.064188 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.224174 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-inventory\") pod \"cfeec82e-6d58-4819-8715-7d0febbe480c\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.224384 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6rmc\" (UniqueName: \"kubernetes.io/projected/cfeec82e-6d58-4819-8715-7d0febbe480c-kube-api-access-p6rmc\") pod \"cfeec82e-6d58-4819-8715-7d0febbe480c\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.224439 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-ssh-key\") pod \"cfeec82e-6d58-4819-8715-7d0febbe480c\" (UID: \"cfeec82e-6d58-4819-8715-7d0febbe480c\") " Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.230506 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfeec82e-6d58-4819-8715-7d0febbe480c-kube-api-access-p6rmc" (OuterVolumeSpecName: "kube-api-access-p6rmc") pod "cfeec82e-6d58-4819-8715-7d0febbe480c" (UID: "cfeec82e-6d58-4819-8715-7d0febbe480c"). InnerVolumeSpecName "kube-api-access-p6rmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.269737 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cfeec82e-6d58-4819-8715-7d0febbe480c" (UID: "cfeec82e-6d58-4819-8715-7d0febbe480c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.272534 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-inventory" (OuterVolumeSpecName: "inventory") pod "cfeec82e-6d58-4819-8715-7d0febbe480c" (UID: "cfeec82e-6d58-4819-8715-7d0febbe480c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.328138 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6rmc\" (UniqueName: \"kubernetes.io/projected/cfeec82e-6d58-4819-8715-7d0febbe480c-kube-api-access-p6rmc\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.328199 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.328227 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfeec82e-6d58-4819-8715-7d0febbe480c-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.633160 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" event={"ID":"cfeec82e-6d58-4819-8715-7d0febbe480c","Type":"ContainerDied","Data":"b30636a94dddf477db4c2beaae60b654df750917e3f72216387153b5d893318a"} Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.633250 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b30636a94dddf477db4c2beaae60b654df750917e3f72216387153b5d893318a" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.633275 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.709037 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw"] Nov 22 09:43:41 crc kubenswrapper[4846]: E1122 09:43:41.709473 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfeec82e-6d58-4819-8715-7d0febbe480c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.709491 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfeec82e-6d58-4819-8715-7d0febbe480c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.709711 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfeec82e-6d58-4819-8715-7d0febbe480c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.710366 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.712850 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.713144 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.713743 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.719699 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.720483 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw"] Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.739292 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bs5cw\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.739405 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bs5cw\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.739442 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89fzj\" (UniqueName: \"kubernetes.io/projected/c12b9ae4-5d39-4ce1-bca3-8b128038532e-kube-api-access-89fzj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bs5cw\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.840831 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bs5cw\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.840950 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bs5cw\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.840988 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89fzj\" (UniqueName: \"kubernetes.io/projected/c12b9ae4-5d39-4ce1-bca3-8b128038532e-kube-api-access-89fzj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bs5cw\" (UID: 
\"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.846675 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bs5cw\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.849499 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bs5cw\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:41 crc kubenswrapper[4846]: I1122 09:43:41.857728 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89fzj\" (UniqueName: \"kubernetes.io/projected/c12b9ae4-5d39-4ce1-bca3-8b128038532e-kube-api-access-89fzj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-bs5cw\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:42 crc kubenswrapper[4846]: I1122 09:43:42.048182 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:43:42 crc kubenswrapper[4846]: I1122 09:43:42.366818 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw"] Nov 22 09:43:42 crc kubenswrapper[4846]: I1122 09:43:42.642426 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" event={"ID":"c12b9ae4-5d39-4ce1-bca3-8b128038532e","Type":"ContainerStarted","Data":"cb7d1c328bf68252c474402c7b44222bb4d158198a5c8061f147acfcfa3acbff"} Nov 22 09:43:43 crc kubenswrapper[4846]: I1122 09:43:43.655103 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" event={"ID":"c12b9ae4-5d39-4ce1-bca3-8b128038532e","Type":"ContainerStarted","Data":"11e8b9709039a50382eb7b234f87589eb67c45848ef281205d7a48fe41f80a66"} Nov 22 09:43:43 crc kubenswrapper[4846]: I1122 09:43:43.688592 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" podStartSLOduration=2.299522293 podStartE2EDuration="2.688571102s" podCreationTimestamp="2025-11-22 09:43:41 +0000 UTC" firstStartedPulling="2025-11-22 09:43:42.381556723 +0000 UTC m=+1797.317246362" lastFinishedPulling="2025-11-22 09:43:42.770605502 +0000 UTC m=+1797.706295171" observedRunningTime="2025-11-22 09:43:43.676119353 +0000 UTC m=+1798.611809042" watchObservedRunningTime="2025-11-22 09:43:43.688571102 +0000 UTC m=+1798.624260761" Nov 22 09:43:53 crc kubenswrapper[4846]: I1122 09:43:53.035397 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:43:53 crc kubenswrapper[4846]: E1122 09:43:53.036547 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:43:59 crc kubenswrapper[4846]: I1122 09:43:59.061580 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fzdp"] Nov 22 09:43:59 crc kubenswrapper[4846]: I1122 09:43:59.074647 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fzdp"] Nov 22 09:44:00 crc kubenswrapper[4846]: I1122 09:44:00.055315 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9154a2c-895f-4ef4-921a-08305d1f8c4f" path="/var/lib/kubelet/pods/e9154a2c-895f-4ef4-921a-08305d1f8c4f/volumes" Nov 22 09:44:04 crc kubenswrapper[4846]: I1122 09:44:04.034919 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:44:04 crc kubenswrapper[4846]: E1122 09:44:04.035664 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:44:14 crc kubenswrapper[4846]: I1122 09:44:14.639091 4846 scope.go:117] "RemoveContainer" containerID="15b768f9e5b5689a99821d2627ea76f4efcb0c822d2151f661a0d131fe14cd7a" Nov 22 09:44:14 crc kubenswrapper[4846]: I1122 09:44:14.684237 4846 scope.go:117] "RemoveContainer" containerID="b085c9e72a5a9368b7096ba50e7fb2e6bf539651f2c8555c62bed68842e0a655" Nov 22 09:44:14 crc kubenswrapper[4846]: I1122 09:44:14.767236 4846 scope.go:117] "RemoveContainer" containerID="73eeb0ca2e6395869514fa1329b6a4d59c6cfe218f8cc81f9efaeeee170bbc6b" Nov 22 09:44:14 crc kubenswrapper[4846]: I1122 09:44:14.820585 4846 scope.go:117] "RemoveContainer" containerID="d36e2b03a65a697f6a604e9a2e66a36e5e023a0a5fd2d3c62c6bd4a5f4ed1047" Nov 22 09:44:14 crc kubenswrapper[4846]: I1122 09:44:14.868080 4846 scope.go:117] "RemoveContainer" containerID="d1d8d349e5418a002194dd19912a535461117ebfae5f89e97531b0202ba7740b" Nov 22 09:44:14 crc kubenswrapper[4846]: I1122 09:44:14.903601 4846 scope.go:117] "RemoveContainer" containerID="3d25922e03648ba50de7206628b4a3773f88136364b8d7161edab37f45f8e5d9" Nov 22 09:44:14 crc kubenswrapper[4846]: I1122 09:44:14.940992 4846 scope.go:117] "RemoveContainer" containerID="e4eff862f8b641426e53f1112558ad1879bdfa977c7fed124e3a5fdf0376720b" Nov 22 09:44:18 crc kubenswrapper[4846]: I1122 09:44:18.035865 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:44:18 crc kubenswrapper[4846]: E1122 09:44:18.036848 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:44:24 crc kubenswrapper[4846]: I1122 09:44:24.071474 4846 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell0-cell-mapping-4bdbs"] Nov 22 09:44:24 crc kubenswrapper[4846]: I1122 09:44:24.091608 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4bdbs"] Nov 22 09:44:25 crc kubenswrapper[4846]: I1122 09:44:25.039321 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fmhx6"] Nov 22 09:44:25 crc kubenswrapper[4846]: I1122 09:44:25.048367 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fmhx6"] Nov 22 09:44:25 crc kubenswrapper[4846]: I1122 09:44:25.227492 4846 generic.go:334] "Generic (PLEG): container finished" podID="c12b9ae4-5d39-4ce1-bca3-8b128038532e" containerID="11e8b9709039a50382eb7b234f87589eb67c45848ef281205d7a48fe41f80a66" exitCode=0 Nov 22 09:44:25 crc kubenswrapper[4846]: I1122 09:44:25.227577 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" event={"ID":"c12b9ae4-5d39-4ce1-bca3-8b128038532e","Type":"ContainerDied","Data":"11e8b9709039a50382eb7b234f87589eb67c45848ef281205d7a48fe41f80a66"} Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.053714 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86cc211f-34a7-4337-9560-1b30aae9b177" path="/var/lib/kubelet/pods/86cc211f-34a7-4337-9560-1b30aae9b177/volumes" Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.054807 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ece6607-f6f8-4060-9a47-0ccd9560ce96" path="/var/lib/kubelet/pods/8ece6607-f6f8-4060-9a47-0ccd9560ce96/volumes" Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.755968 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.873288 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-ssh-key\") pod \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.873431 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-inventory\") pod \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.873601 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89fzj\" (UniqueName: \"kubernetes.io/projected/c12b9ae4-5d39-4ce1-bca3-8b128038532e-kube-api-access-89fzj\") pod \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\" (UID: \"c12b9ae4-5d39-4ce1-bca3-8b128038532e\") " Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.879968 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12b9ae4-5d39-4ce1-bca3-8b128038532e-kube-api-access-89fzj" (OuterVolumeSpecName: "kube-api-access-89fzj") pod "c12b9ae4-5d39-4ce1-bca3-8b128038532e" (UID: "c12b9ae4-5d39-4ce1-bca3-8b128038532e"). InnerVolumeSpecName "kube-api-access-89fzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.902240 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c12b9ae4-5d39-4ce1-bca3-8b128038532e" (UID: "c12b9ae4-5d39-4ce1-bca3-8b128038532e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.920802 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-inventory" (OuterVolumeSpecName: "inventory") pod "c12b9ae4-5d39-4ce1-bca3-8b128038532e" (UID: "c12b9ae4-5d39-4ce1-bca3-8b128038532e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.977208 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89fzj\" (UniqueName: \"kubernetes.io/projected/c12b9ae4-5d39-4ce1-bca3-8b128038532e-kube-api-access-89fzj\") on node \"crc\" DevicePath \"\"" Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.977266 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:44:26 crc kubenswrapper[4846]: I1122 09:44:26.977286 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12b9ae4-5d39-4ce1-bca3-8b128038532e-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.251690 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" event={"ID":"c12b9ae4-5d39-4ce1-bca3-8b128038532e","Type":"ContainerDied","Data":"cb7d1c328bf68252c474402c7b44222bb4d158198a5c8061f147acfcfa3acbff"} Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.252202 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb7d1c328bf68252c474402c7b44222bb4d158198a5c8061f147acfcfa3acbff" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.252293 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-bs5cw" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.349859 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx"] Nov 22 09:44:27 crc kubenswrapper[4846]: E1122 09:44:27.350538 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12b9ae4-5d39-4ce1-bca3-8b128038532e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.350608 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12b9ae4-5d39-4ce1-bca3-8b128038532e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.350822 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12b9ae4-5d39-4ce1-bca3-8b128038532e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.352631 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.358841 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.359101 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.359312 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.359691 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.369298 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx"] Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.490331 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-prvrx\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.490446 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-prvrx\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.490523 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjxn4\" (UniqueName: \"kubernetes.io/projected/3db36453-67bc-491e-b87f-df3a840178b1-kube-api-access-bjxn4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-prvrx\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.593423 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-prvrx\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.593502 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-prvrx\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.593591 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjxn4\" (UniqueName: \"kubernetes.io/projected/3db36453-67bc-491e-b87f-df3a840178b1-kube-api-access-bjxn4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-prvrx\" 
(UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.599273 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-prvrx\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.599943 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-prvrx\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.618138 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjxn4\" (UniqueName: \"kubernetes.io/projected/3db36453-67bc-491e-b87f-df3a840178b1-kube-api-access-bjxn4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-prvrx\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:27 crc kubenswrapper[4846]: I1122 09:44:27.683230 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:44:28 crc kubenswrapper[4846]: I1122 09:44:28.335282 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx"] Nov 22 09:44:29 crc kubenswrapper[4846]: I1122 09:44:29.286564 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" event={"ID":"3db36453-67bc-491e-b87f-df3a840178b1","Type":"ContainerStarted","Data":"ed59d4ff1f3b71c4608be78d871b2e8377e8fafdac04bf2203472f816a4943e3"} Nov 22 09:44:29 crc kubenswrapper[4846]: I1122 09:44:29.287222 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" event={"ID":"3db36453-67bc-491e-b87f-df3a840178b1","Type":"ContainerStarted","Data":"ffe9debe344517d482b7325ecb88904efbac0bfd9b1bd29fb8a937157afac132"} Nov 22 09:44:29 crc kubenswrapper[4846]: I1122 09:44:29.321598 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" podStartSLOduration=1.866655486 podStartE2EDuration="2.321563934s" podCreationTimestamp="2025-11-22 09:44:27 +0000 UTC" firstStartedPulling="2025-11-22 09:44:28.338587302 +0000 UTC m=+1843.274276951" lastFinishedPulling="2025-11-22 09:44:28.79349571 +0000 UTC m=+1843.729185399" observedRunningTime="2025-11-22 09:44:29.30791124 +0000 UTC m=+1844.243600899" watchObservedRunningTime="2025-11-22 09:44:29.321563934 +0000 UTC m=+1844.257253633" Nov 22 09:44:33 crc kubenswrapper[4846]: I1122 09:44:33.035959 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:44:33 crc kubenswrapper[4846]: E1122 09:44:33.037228 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:44:47 crc kubenswrapper[4846]: I1122 09:44:47.036256 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:44:47 crc kubenswrapper[4846]: E1122 09:44:47.037862 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:44:58 crc kubenswrapper[4846]: I1122 09:44:58.036169 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:44:58 crc kubenswrapper[4846]: E1122 09:44:58.037593 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.170837 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6"] Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.173389 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.176904 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.177180 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.185604 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6"] Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.270414 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-config-volume\") pod \"collect-profiles-29396745-6v8z6\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.270537 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd7w7\" (UniqueName: \"kubernetes.io/projected/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-kube-api-access-kd7w7\") pod \"collect-profiles-29396745-6v8z6\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.270631 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-secret-volume\") pod \"collect-profiles-29396745-6v8z6\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.372277 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-config-volume\") pod \"collect-profiles-29396745-6v8z6\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.372426 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd7w7\" (UniqueName: \"kubernetes.io/projected/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-kube-api-access-kd7w7\") pod \"collect-profiles-29396745-6v8z6\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.372599 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-secret-volume\") pod \"collect-profiles-29396745-6v8z6\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.373912 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-config-volume\") pod 
\"collect-profiles-29396745-6v8z6\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.383683 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-secret-volume\") pod \"collect-profiles-29396745-6v8z6\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.405162 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd7w7\" (UniqueName: \"kubernetes.io/projected/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-kube-api-access-kd7w7\") pod \"collect-profiles-29396745-6v8z6\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:00 crc kubenswrapper[4846]: I1122 09:45:00.513785 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:01 crc kubenswrapper[4846]: I1122 09:45:01.020682 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6"] Nov 22 09:45:01 crc kubenswrapper[4846]: I1122 09:45:01.669362 4846 generic.go:334] "Generic (PLEG): container finished" podID="43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9" containerID="f1d9c6f83991ffffd979d00041bb0d3685f477fbc4d195adb605c3b86e04d95b" exitCode=0 Nov 22 09:45:01 crc kubenswrapper[4846]: I1122 09:45:01.669437 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" event={"ID":"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9","Type":"ContainerDied","Data":"f1d9c6f83991ffffd979d00041bb0d3685f477fbc4d195adb605c3b86e04d95b"} Nov 22 09:45:01 crc kubenswrapper[4846]: I1122 09:45:01.669637 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" event={"ID":"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9","Type":"ContainerStarted","Data":"ea7839b816ee537a0d4452590f0525df81a7ed1a016de25a5dee2c53480ef49d"} Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.128663 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.230943 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-secret-volume\") pod \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.231127 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-config-volume\") pod \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.231993 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-config-volume" (OuterVolumeSpecName: "config-volume") pod "43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9" (UID: "43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.232141 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd7w7\" (UniqueName: \"kubernetes.io/projected/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-kube-api-access-kd7w7\") pod \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\" (UID: \"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9\") " Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.232547 4846 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.237814 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9" (UID: "43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.245671 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-kube-api-access-kd7w7" (OuterVolumeSpecName: "kube-api-access-kd7w7") pod "43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9" (UID: "43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9"). InnerVolumeSpecName "kube-api-access-kd7w7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.334156 4846 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.334206 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd7w7\" (UniqueName: \"kubernetes.io/projected/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9-kube-api-access-kd7w7\") on node \"crc\" DevicePath \"\"" Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.694174 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" event={"ID":"43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9","Type":"ContainerDied","Data":"ea7839b816ee537a0d4452590f0525df81a7ed1a016de25a5dee2c53480ef49d"} Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.694595 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea7839b816ee537a0d4452590f0525df81a7ed1a016de25a5dee2c53480ef49d" Nov 22 09:45:03 crc kubenswrapper[4846]: I1122 09:45:03.694640 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6" Nov 22 09:45:09 crc kubenswrapper[4846]: I1122 09:45:09.036139 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:45:09 crc kubenswrapper[4846]: E1122 09:45:09.037628 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:45:09 crc kubenswrapper[4846]: I1122 09:45:09.075678 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-lvfpk"] Nov 22 09:45:09 crc kubenswrapper[4846]: I1122 09:45:09.087226 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-lvfpk"] Nov 22 09:45:10 crc kubenswrapper[4846]: I1122 09:45:10.056035 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b782bb33-f288-4cea-8e09-f89cf68c5154" path="/var/lib/kubelet/pods/b782bb33-f288-4cea-8e09-f89cf68c5154/volumes" Nov 22 09:45:15 crc kubenswrapper[4846]: I1122 09:45:15.182216 4846 scope.go:117] "RemoveContainer" containerID="332780e64e6f2a94d91f66052b9db84174170d0c9a16946788c4eda32334cf68" Nov 22 09:45:15 crc kubenswrapper[4846]: I1122 09:45:15.229198 4846 scope.go:117] "RemoveContainer" containerID="aa8e32ec449b393f08938d6d53fade9a75e07cda08e61026e31045d7a266c6fb" Nov 22 09:45:15 crc kubenswrapper[4846]: I1122 09:45:15.312592 4846 scope.go:117] "RemoveContainer" containerID="e318969a15becfc0ebbc3f5637565e5a61c752953d42935e2586fb34b690e022" Nov 22 09:45:20 crc kubenswrapper[4846]: I1122 09:45:20.035696 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:45:20 crc kubenswrapper[4846]: E1122 09:45:20.037206 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:45:27 crc kubenswrapper[4846]: I1122 09:45:27.003189 4846 generic.go:334] "Generic (PLEG): container finished" podID="3db36453-67bc-491e-b87f-df3a840178b1" containerID="ed59d4ff1f3b71c4608be78d871b2e8377e8fafdac04bf2203472f816a4943e3" exitCode=0 Nov 22 09:45:27 crc kubenswrapper[4846]: I1122 09:45:27.003273 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" event={"ID":"3db36453-67bc-491e-b87f-df3a840178b1","Type":"ContainerDied","Data":"ed59d4ff1f3b71c4608be78d871b2e8377e8fafdac04bf2203472f816a4943e3"} Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.593917 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.601686 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjxn4\" (UniqueName: \"kubernetes.io/projected/3db36453-67bc-491e-b87f-df3a840178b1-kube-api-access-bjxn4\") pod \"3db36453-67bc-491e-b87f-df3a840178b1\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.612161 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db36453-67bc-491e-b87f-df3a840178b1-kube-api-access-bjxn4" (OuterVolumeSpecName: "kube-api-access-bjxn4") pod "3db36453-67bc-491e-b87f-df3a840178b1" (UID: "3db36453-67bc-491e-b87f-df3a840178b1"). InnerVolumeSpecName "kube-api-access-bjxn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.703805 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-ssh-key\") pod \"3db36453-67bc-491e-b87f-df3a840178b1\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.703876 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-inventory\") pod \"3db36453-67bc-491e-b87f-df3a840178b1\" (UID: \"3db36453-67bc-491e-b87f-df3a840178b1\") " Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.704372 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjxn4\" (UniqueName: \"kubernetes.io/projected/3db36453-67bc-491e-b87f-df3a840178b1-kube-api-access-bjxn4\") on node \"crc\" DevicePath \"\"" Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.739201 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3db36453-67bc-491e-b87f-df3a840178b1" (UID: "3db36453-67bc-491e-b87f-df3a840178b1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.739215 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-inventory" (OuterVolumeSpecName: "inventory") pod "3db36453-67bc-491e-b87f-df3a840178b1" (UID: "3db36453-67bc-491e-b87f-df3a840178b1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.809367 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:45:28 crc kubenswrapper[4846]: I1122 09:45:28.809412 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db36453-67bc-491e-b87f-df3a840178b1-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.036634 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" event={"ID":"3db36453-67bc-491e-b87f-df3a840178b1","Type":"ContainerDied","Data":"ffe9debe344517d482b7325ecb88904efbac0bfd9b1bd29fb8a937157afac132"} Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.036737 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffe9debe344517d482b7325ecb88904efbac0bfd9b1bd29fb8a937157afac132" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.036755 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-prvrx" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.150416 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t6fpn"] Nov 22 09:45:29 crc kubenswrapper[4846]: E1122 09:45:29.150845 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9" containerName="collect-profiles" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.150857 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9" containerName="collect-profiles" Nov 22 09:45:29 crc kubenswrapper[4846]: E1122 09:45:29.150871 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db36453-67bc-491e-b87f-df3a840178b1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.150878 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db36453-67bc-491e-b87f-df3a840178b1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.151163 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db36453-67bc-491e-b87f-df3a840178b1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.151180 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9" containerName="collect-profiles" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.151784 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.157513 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.157545 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.158329 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.159431 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.162866 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t6fpn"] Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.319213 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t6fpn\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.319529 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t6fpn\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.319570 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg962\" (UniqueName: \"kubernetes.io/projected/cf943f33-8c4e-4195-aa85-c1f60841b9ab-kube-api-access-pg962\") pod \"ssh-known-hosts-edpm-deployment-t6fpn\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.421765 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t6fpn\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.421825 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t6fpn\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.421879 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg962\" (UniqueName: \"kubernetes.io/projected/cf943f33-8c4e-4195-aa85-c1f60841b9ab-kube-api-access-pg962\") pod \"ssh-known-hosts-edpm-deployment-t6fpn\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc 
kubenswrapper[4846]: I1122 09:45:29.426917 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t6fpn\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.427334 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t6fpn\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.443178 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg962\" (UniqueName: \"kubernetes.io/projected/cf943f33-8c4e-4195-aa85-c1f60841b9ab-kube-api-access-pg962\") pod \"ssh-known-hosts-edpm-deployment-t6fpn\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") " pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:29 crc kubenswrapper[4846]: I1122 09:45:29.475620 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" Nov 22 09:45:30 crc kubenswrapper[4846]: I1122 09:45:30.112744 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t6fpn"] Nov 22 09:45:30 crc kubenswrapper[4846]: I1122 09:45:30.118934 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:45:31 crc kubenswrapper[4846]: I1122 09:45:31.068018 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" event={"ID":"cf943f33-8c4e-4195-aa85-c1f60841b9ab","Type":"ContainerStarted","Data":"dcb012d46048b808a9b28a574003dbb1aab14c164e5d5e2addab6ab2938e036b"} Nov 22 09:45:32 crc kubenswrapper[4846]: I1122 09:45:32.083096 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" event={"ID":"cf943f33-8c4e-4195-aa85-c1f60841b9ab","Type":"ContainerStarted","Data":"8596a64f6f19f164bbdf552abf164de0da1530fd0b3ff76f96417cb2a08536bd"} Nov 22 09:45:32 crc kubenswrapper[4846]: I1122 09:45:32.112804 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" podStartSLOduration=2.186662315 podStartE2EDuration="3.11278234s" podCreationTimestamp="2025-11-22 09:45:29 +0000 UTC" firstStartedPulling="2025-11-22 09:45:30.118710135 +0000 UTC m=+1905.054399774" lastFinishedPulling="2025-11-22 09:45:31.04483012 +0000 UTC m=+1905.980519799" observedRunningTime="2025-11-22 09:45:32.105328426 +0000 UTC m=+1907.041018115" watchObservedRunningTime="2025-11-22 09:45:32.11278234 +0000 UTC m=+1907.048471999" Nov 22 09:45:34 crc kubenswrapper[4846]: I1122 09:45:34.041745 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:45:35 crc kubenswrapper[4846]: I1122 09:45:35.122121 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"6fddd433fca0a7c496840e53210f130dc975d91035c6bf896d27eba8ebfc15e7"} Nov 22 09:45:39 crc 
kubenswrapper[4846]: I1122 09:45:39.181166 4846 generic.go:334] "Generic (PLEG): container finished" podID="cf943f33-8c4e-4195-aa85-c1f60841b9ab" containerID="8596a64f6f19f164bbdf552abf164de0da1530fd0b3ff76f96417cb2a08536bd" exitCode=0
Nov 22 09:45:39 crc kubenswrapper[4846]: I1122 09:45:39.181221 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" event={"ID":"cf943f33-8c4e-4195-aa85-c1f60841b9ab","Type":"ContainerDied","Data":"8596a64f6f19f164bbdf552abf164de0da1530fd0b3ff76f96417cb2a08536bd"}
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.637434 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn"
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.672832 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-inventory-0\") pod \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") "
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.673343 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg962\" (UniqueName: \"kubernetes.io/projected/cf943f33-8c4e-4195-aa85-c1f60841b9ab-kube-api-access-pg962\") pod \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") "
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.673615 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-ssh-key-openstack-edpm-ipam\") pod \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\" (UID: \"cf943f33-8c4e-4195-aa85-c1f60841b9ab\") "
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.681263 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf943f33-8c4e-4195-aa85-c1f60841b9ab-kube-api-access-pg962" (OuterVolumeSpecName: "kube-api-access-pg962") pod "cf943f33-8c4e-4195-aa85-c1f60841b9ab" (UID: "cf943f33-8c4e-4195-aa85-c1f60841b9ab"). InnerVolumeSpecName "kube-api-access-pg962". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.703272 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "cf943f33-8c4e-4195-aa85-c1f60841b9ab" (UID: "cf943f33-8c4e-4195-aa85-c1f60841b9ab"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.712540 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cf943f33-8c4e-4195-aa85-c1f60841b9ab" (UID: "cf943f33-8c4e-4195-aa85-c1f60841b9ab"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.776073 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.776103 4846 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cf943f33-8c4e-4195-aa85-c1f60841b9ab-inventory-0\") on node \"crc\" DevicePath \"\""
Nov 22 09:45:40 crc kubenswrapper[4846]: I1122 09:45:40.776112 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg962\" (UniqueName: \"kubernetes.io/projected/cf943f33-8c4e-4195-aa85-c1f60841b9ab-kube-api-access-pg962\") on node \"crc\" DevicePath \"\""
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.210525 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn" event={"ID":"cf943f33-8c4e-4195-aa85-c1f60841b9ab","Type":"ContainerDied","Data":"dcb012d46048b808a9b28a574003dbb1aab14c164e5d5e2addab6ab2938e036b"}
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.210574 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcb012d46048b808a9b28a574003dbb1aab14c164e5d5e2addab6ab2938e036b"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.210979 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t6fpn"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.321704 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"]
Nov 22 09:45:41 crc kubenswrapper[4846]: E1122 09:45:41.322463 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf943f33-8c4e-4195-aa85-c1f60841b9ab" containerName="ssh-known-hosts-edpm-deployment"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.322503 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf943f33-8c4e-4195-aa85-c1f60841b9ab" containerName="ssh-known-hosts-edpm-deployment"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.322938 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf943f33-8c4e-4195-aa85-c1f60841b9ab" containerName="ssh-known-hosts-edpm-deployment"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.324177 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.326729 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.326962 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.327796 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.329728 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.333194 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"]
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.390342 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkqr4\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.390425 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkqr4\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.390524 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzdpd\" (UniqueName: \"kubernetes.io/projected/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-kube-api-access-nzdpd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkqr4\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.492493 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkqr4\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.492645 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkqr4\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.492857 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzdpd\" (UniqueName: \"kubernetes.io/projected/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-kube-api-access-nzdpd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkqr4\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.497218 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkqr4\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.499757 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkqr4\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.514149 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzdpd\" (UniqueName: \"kubernetes.io/projected/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-kube-api-access-nzdpd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkqr4\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:41 crc kubenswrapper[4846]: I1122 09:45:41.652693 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:42 crc kubenswrapper[4846]: I1122 09:45:42.328384 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"]
Nov 22 09:45:43 crc kubenswrapper[4846]: I1122 09:45:43.245887 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4" event={"ID":"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297","Type":"ContainerStarted","Data":"efef4f4f26848e10788ba7e7add39f97360eb136c89ffd86677de188f1fdddbf"}
Nov 22 09:45:44 crc kubenswrapper[4846]: I1122 09:45:44.256298 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4" event={"ID":"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297","Type":"ContainerStarted","Data":"5c0cc58ad13a33bf42bb10e15f19333054a5e4a047f9d882cc20160983ce805f"}
Nov 22 09:45:44 crc kubenswrapper[4846]: I1122 09:45:44.283388 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4" podStartSLOduration=2.470436606 podStartE2EDuration="3.283365419s" podCreationTimestamp="2025-11-22 09:45:41 +0000 UTC" firstStartedPulling="2025-11-22 09:45:42.344303819 +0000 UTC m=+1917.279993508" lastFinishedPulling="2025-11-22 09:45:43.157232642 +0000 UTC m=+1918.092922321" observedRunningTime="2025-11-22 09:45:44.274801032 +0000 UTC m=+1919.210490691" watchObservedRunningTime="2025-11-22 09:45:44.283365419 +0000 UTC m=+1919.219055078"
Nov 22 09:45:52 crc kubenswrapper[4846]: I1122 09:45:52.367024 4846 generic.go:334] "Generic (PLEG): container finished" podID="6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297" containerID="5c0cc58ad13a33bf42bb10e15f19333054a5e4a047f9d882cc20160983ce805f" exitCode=0
Nov 22 09:45:52 crc kubenswrapper[4846]: I1122 09:45:52.367119 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4" event={"ID":"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297","Type":"ContainerDied","Data":"5c0cc58ad13a33bf42bb10e15f19333054a5e4a047f9d882cc20160983ce805f"}
Nov 22 09:45:53 crc kubenswrapper[4846]: I1122 09:45:53.858520 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:53 crc kubenswrapper[4846]: I1122 09:45:53.982422 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzdpd\" (UniqueName: \"kubernetes.io/projected/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-kube-api-access-nzdpd\") pod \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") "
Nov 22 09:45:53 crc kubenswrapper[4846]: I1122 09:45:53.982499 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-ssh-key\") pod \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") "
Nov 22 09:45:53 crc kubenswrapper[4846]: I1122 09:45:53.982674 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-inventory\") pod \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\" (UID: \"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297\") "
Nov 22 09:45:53 crc kubenswrapper[4846]: I1122 09:45:53.988279 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-kube-api-access-nzdpd" (OuterVolumeSpecName: "kube-api-access-nzdpd") pod "6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297" (UID: "6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297"). InnerVolumeSpecName "kube-api-access-nzdpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.011554 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-inventory" (OuterVolumeSpecName: "inventory") pod "6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297" (UID: "6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.013049 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297" (UID: "6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.103027 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.103099 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzdpd\" (UniqueName: \"kubernetes.io/projected/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-kube-api-access-nzdpd\") on node \"crc\" DevicePath \"\""
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.103118 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.394901 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4" event={"ID":"6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297","Type":"ContainerDied","Data":"efef4f4f26848e10788ba7e7add39f97360eb136c89ffd86677de188f1fdddbf"}
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.395820 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efef4f4f26848e10788ba7e7add39f97360eb136c89ffd86677de188f1fdddbf"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.395003 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkqr4"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.516368 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"]
Nov 22 09:45:54 crc kubenswrapper[4846]: E1122 09:45:54.516890 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.516921 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.517262 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.520653 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.523826 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.524409 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.524468 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.525373 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.533791 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"]
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.614260 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56wlb\" (UniqueName: \"kubernetes.io/projected/0364c9c7-ad57-4109-bdf0-9c888a609515-kube-api-access-56wlb\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.614560 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.614989 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.717252 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.717364 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56wlb\" (UniqueName: \"kubernetes.io/projected/0364c9c7-ad57-4109-bdf0-9c888a609515-kube-api-access-56wlb\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.717424 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.722696 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.731830 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.736763 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56wlb\" (UniqueName: \"kubernetes.io/projected/0364c9c7-ad57-4109-bdf0-9c888a609515-kube-api-access-56wlb\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:54 crc kubenswrapper[4846]: I1122 09:45:54.866640 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:45:55 crc kubenswrapper[4846]: I1122 09:45:55.510564 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"]
Nov 22 09:45:55 crc kubenswrapper[4846]: W1122 09:45:55.518588 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0364c9c7_ad57_4109_bdf0_9c888a609515.slice/crio-d916e620bc5971a357725e1ddd989b91a29524039d1ca62222381b363b78c494 WatchSource:0}: Error finding container d916e620bc5971a357725e1ddd989b91a29524039d1ca62222381b363b78c494: Status 404 returned error can't find the container with id d916e620bc5971a357725e1ddd989b91a29524039d1ca62222381b363b78c494
Nov 22 09:45:56 crc kubenswrapper[4846]: I1122 09:45:56.416853 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg" event={"ID":"0364c9c7-ad57-4109-bdf0-9c888a609515","Type":"ContainerStarted","Data":"da241bf800d9c3cfe4bbec8ecfc51e61f42a28fb41ff6e7adc14a32a1830a927"}
Nov 22 09:45:56 crc kubenswrapper[4846]: I1122 09:45:56.417430 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg" event={"ID":"0364c9c7-ad57-4109-bdf0-9c888a609515","Type":"ContainerStarted","Data":"d916e620bc5971a357725e1ddd989b91a29524039d1ca62222381b363b78c494"}
Nov 22 09:45:56 crc kubenswrapper[4846]: I1122 09:45:56.434883 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg" podStartSLOduration=2.017164009 podStartE2EDuration="2.434864994s" podCreationTimestamp="2025-11-22 09:45:54 +0000 UTC" firstStartedPulling="2025-11-22 09:45:55.522173717 +0000 UTC m=+1930.457863356" lastFinishedPulling="2025-11-22 09:45:55.939874702 +0000 UTC m=+1930.875564341" observedRunningTime="2025-11-22 09:45:56.430225741 +0000 UTC m=+1931.365915390" watchObservedRunningTime="2025-11-22 09:45:56.434864994 +0000 UTC m=+1931.370554633"
Nov 22 09:46:07 crc kubenswrapper[4846]: I1122 09:46:07.549262 4846 generic.go:334] "Generic (PLEG): container finished" podID="0364c9c7-ad57-4109-bdf0-9c888a609515" containerID="da241bf800d9c3cfe4bbec8ecfc51e61f42a28fb41ff6e7adc14a32a1830a927" exitCode=0
Nov 22 09:46:07 crc kubenswrapper[4846]: I1122 09:46:07.549335 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg" event={"ID":"0364c9c7-ad57-4109-bdf0-9c888a609515","Type":"ContainerDied","Data":"da241bf800d9c3cfe4bbec8ecfc51e61f42a28fb41ff6e7adc14a32a1830a927"}
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.125006 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.270160 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-ssh-key\") pod \"0364c9c7-ad57-4109-bdf0-9c888a609515\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") "
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.270496 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56wlb\" (UniqueName: \"kubernetes.io/projected/0364c9c7-ad57-4109-bdf0-9c888a609515-kube-api-access-56wlb\") pod \"0364c9c7-ad57-4109-bdf0-9c888a609515\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") "
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.270522 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-inventory\") pod \"0364c9c7-ad57-4109-bdf0-9c888a609515\" (UID: \"0364c9c7-ad57-4109-bdf0-9c888a609515\") "
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.285734 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0364c9c7-ad57-4109-bdf0-9c888a609515-kube-api-access-56wlb" (OuterVolumeSpecName: "kube-api-access-56wlb") pod "0364c9c7-ad57-4109-bdf0-9c888a609515" (UID: "0364c9c7-ad57-4109-bdf0-9c888a609515"). InnerVolumeSpecName "kube-api-access-56wlb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.300019 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0364c9c7-ad57-4109-bdf0-9c888a609515" (UID: "0364c9c7-ad57-4109-bdf0-9c888a609515"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.313637 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-inventory" (OuterVolumeSpecName: "inventory") pod "0364c9c7-ad57-4109-bdf0-9c888a609515" (UID: "0364c9c7-ad57-4109-bdf0-9c888a609515"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.372345 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56wlb\" (UniqueName: \"kubernetes.io/projected/0364c9c7-ad57-4109-bdf0-9c888a609515-kube-api-access-56wlb\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.372578 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.372637 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0364c9c7-ad57-4109-bdf0-9c888a609515-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.577428 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg" event={"ID":"0364c9c7-ad57-4109-bdf0-9c888a609515","Type":"ContainerDied","Data":"d916e620bc5971a357725e1ddd989b91a29524039d1ca62222381b363b78c494"}
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.577500 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d916e620bc5971a357725e1ddd989b91a29524039d1ca62222381b363b78c494"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.577626 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.730303 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"]
Nov 22 09:46:09 crc kubenswrapper[4846]: E1122 09:46:09.731126 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0364c9c7-ad57-4109-bdf0-9c888a609515" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.731144 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0364c9c7-ad57-4109-bdf0-9c888a609515" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.731319 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0364c9c7-ad57-4109-bdf0-9c888a609515" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.732045 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.736711 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.736756 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.736756 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.736875 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.736968 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.737091 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.737121 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.737283 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.747899 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"]
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.810437 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.810584 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.810619 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.810848 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811180 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811252 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811298 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811328 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811368 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811431 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811496 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811576 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwmq7\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-kube-api-access-cwmq7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811638 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.811695 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914093 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914235 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914282 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914325 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914361 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914444 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914506 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914571 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914625 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwmq7\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-kube-api-access-cwmq7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914669 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914720 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914786 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914901 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.914939 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.922421 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.922633 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.923173 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.923321 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.923751 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.924215 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.924332 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.925341 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.926669 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.927417 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.928433 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.929371 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.933612 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:09 crc kubenswrapper[4846]: I1122 09:46:09.934581 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwmq7\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-kube-api-access-cwmq7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-78vtn\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:10 crc kubenswrapper[4846]: I1122 09:46:10.112748 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:10 crc kubenswrapper[4846]: I1122 09:46:10.505307 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"]
Nov 22 09:46:10 crc kubenswrapper[4846]: I1122 09:46:10.591640 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn" event={"ID":"39001bd9-e368-4530-be7d-97c756cb4d39","Type":"ContainerStarted","Data":"a5ab57a19bc56acfaf8698cd99ed8679753e733461d4df9e230951e3a6fb9a62"}
Nov 22 09:46:16 crc kubenswrapper[4846]: I1122 09:46:16.654993 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn" event={"ID":"39001bd9-e368-4530-be7d-97c756cb4d39","Type":"ContainerStarted","Data":"c1c0fe315279e80c6978904a60d06f529959a03170e98be132531b00a094ea40"}
Nov 22 09:46:16 crc kubenswrapper[4846]: I1122 09:46:16.696782 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn" podStartSLOduration=2.10253538 podStartE2EDuration="7.696751066s" podCreationTimestamp="2025-11-22 09:46:09 +0000 UTC" firstStartedPulling="2025-11-22 09:46:10.511912833 +0000 UTC m=+1945.447602492" lastFinishedPulling="2025-11-22 09:46:16.106128529 +0000 UTC m=+1951.041818178" observedRunningTime="2025-11-22 09:46:16.689030433 +0000 UTC m=+1951.624720122" watchObservedRunningTime="2025-11-22 09:46:16.696751066 +0000 UTC m=+1951.632440745"
Nov 22 09:46:58 crc kubenswrapper[4846]: I1122 09:46:58.096845 4846 generic.go:334] "Generic (PLEG): container finished" podID="39001bd9-e368-4530-be7d-97c756cb4d39" containerID="c1c0fe315279e80c6978904a60d06f529959a03170e98be132531b00a094ea40" exitCode=0
Nov 22 09:46:58 crc kubenswrapper[4846]: I1122 09:46:58.096966 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn" event={"ID":"39001bd9-e368-4530-be7d-97c756cb4d39","Type":"ContainerDied","Data":"c1c0fe315279e80c6978904a60d06f529959a03170e98be132531b00a094ea40"}
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.588656 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn"
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.708211 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-repo-setup-combined-ca-bundle\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.708340 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwmq7\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-kube-api-access-cwmq7\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.708453 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-ovn-default-certs-0\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.708516 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-libvirt-combined-ca-bundle\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.708723 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.708802 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-inventory\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.708849 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ovn-combined-ca-bundle\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.708915 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ssh-key\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.708990 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.709039 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.709113 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-bootstrap-combined-ca-bundle\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.709177 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-nova-combined-ca-bundle\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.709216 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-neutron-metadata-combined-ca-bundle\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.709261 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-telemetry-combined-ca-bundle\") pod \"39001bd9-e368-4530-be7d-97c756cb4d39\" (UID: \"39001bd9-e368-4530-be7d-97c756cb4d39\") "
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.714790 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.715105 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.715481 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.717157 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.718473 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.718904 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-kube-api-access-cwmq7" (OuterVolumeSpecName: "kube-api-access-cwmq7") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "kube-api-access-cwmq7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.719291 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.721957 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.722329 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.723975 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.725486 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.733510 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.750766 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-inventory" (OuterVolumeSpecName: "inventory") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.777481 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "39001bd9-e368-4530-be7d-97c756cb4d39" (UID: "39001bd9-e368-4530-be7d-97c756cb4d39"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812108 4846 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812248 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812340 4846 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812417 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812494 4846 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812569 4846 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812642 4846 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812721 4846 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812791 4846 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812868 4846 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.812947 4846 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.813017 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwmq7\" (UniqueName:
\"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-kube-api-access-cwmq7\") on node \"crc\" DevicePath \"\"" Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.813122 4846 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/39001bd9-e368-4530-be7d-97c756cb4d39-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:46:59 crc kubenswrapper[4846]: I1122 09:46:59.813204 4846 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39001bd9-e368-4530-be7d-97c756cb4d39-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.123209 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn" event={"ID":"39001bd9-e368-4530-be7d-97c756cb4d39","Type":"ContainerDied","Data":"a5ab57a19bc56acfaf8698cd99ed8679753e733461d4df9e230951e3a6fb9a62"} Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.123259 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5ab57a19bc56acfaf8698cd99ed8679753e733461d4df9e230951e3a6fb9a62" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.123332 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-78vtn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.254187 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn"] Nov 22 09:47:00 crc kubenswrapper[4846]: E1122 09:47:00.257169 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39001bd9-e368-4530-be7d-97c756cb4d39" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.257201 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="39001bd9-e368-4530-be7d-97c756cb4d39" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.257466 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="39001bd9-e368-4530-be7d-97c756cb4d39" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.258203 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.260739 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.260796 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.261676 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.262092 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.262403 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.279097 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn"] Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.323114 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99dxm\" (UniqueName: \"kubernetes.io/projected/d326c85b-6234-469b-b6f4-8a4d72b62dab-kube-api-access-99dxm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.323269 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.323298 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.323344 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.323381 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.425292 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.425377 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.425467 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.425541 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.425580 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99dxm\" (UniqueName: \"kubernetes.io/projected/d326c85b-6234-469b-b6f4-8a4d72b62dab-kube-api-access-99dxm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.428852 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.431413 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.431754 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.434826 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") 
" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.448955 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99dxm\" (UniqueName: \"kubernetes.io/projected/d326c85b-6234-469b-b6f4-8a4d72b62dab-kube-api-access-99dxm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rfbkn\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:00 crc kubenswrapper[4846]: I1122 09:47:00.593394 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:47:01 crc kubenswrapper[4846]: I1122 09:47:01.208214 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn"] Nov 22 09:47:01 crc kubenswrapper[4846]: W1122 09:47:01.209400 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd326c85b_6234_469b_b6f4_8a4d72b62dab.slice/crio-eb42df03ee96796b186d0be6d21f890ffa7385e5e0be48e224d670a8d59e1118 WatchSource:0}: Error finding container eb42df03ee96796b186d0be6d21f890ffa7385e5e0be48e224d670a8d59e1118: Status 404 returned error can't find the container with id eb42df03ee96796b186d0be6d21f890ffa7385e5e0be48e224d670a8d59e1118 Nov 22 09:47:02 crc kubenswrapper[4846]: I1122 09:47:02.147541 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" event={"ID":"d326c85b-6234-469b-b6f4-8a4d72b62dab","Type":"ContainerStarted","Data":"c6a30afde122f7079cd4f5b2e777331fa4c9f3ee616e270baa16acf29509e6cb"} Nov 22 09:47:02 crc kubenswrapper[4846]: I1122 09:47:02.147960 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" event={"ID":"d326c85b-6234-469b-b6f4-8a4d72b62dab","Type":"ContainerStarted","Data":"eb42df03ee96796b186d0be6d21f890ffa7385e5e0be48e224d670a8d59e1118"} Nov 22 09:47:02 crc kubenswrapper[4846]: I1122 09:47:02.177225 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" podStartSLOduration=1.730932651 podStartE2EDuration="2.17720285s" podCreationTimestamp="2025-11-22 09:47:00 +0000 UTC" firstStartedPulling="2025-11-22 09:47:01.21166257 +0000 UTC m=+1996.147352229" lastFinishedPulling="2025-11-22 09:47:01.657932749 +0000 UTC m=+1996.593622428" observedRunningTime="2025-11-22 09:47:02.168590112 +0000 UTC m=+1997.104279781" watchObservedRunningTime="2025-11-22 09:47:02.17720285 +0000 UTC m=+1997.112892519" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.726665 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8vqzt"] Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.731327 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.750509 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8vqzt"] Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.819408 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-catalog-content\") pod \"redhat-operators-8vqzt\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.819466 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzl2d\" (UniqueName: \"kubernetes.io/projected/19b70b41-71c0-478d-8634-4de5d2e51e34-kube-api-access-mzl2d\") pod \"redhat-operators-8vqzt\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.819517 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-utilities\") pod \"redhat-operators-8vqzt\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.921385 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-catalog-content\") pod \"redhat-operators-8vqzt\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.921438 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzl2d\" (UniqueName: \"kubernetes.io/projected/19b70b41-71c0-478d-8634-4de5d2e51e34-kube-api-access-mzl2d\") pod \"redhat-operators-8vqzt\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.921483 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-utilities\") pod \"redhat-operators-8vqzt\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.921925 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-utilities\") pod \"redhat-operators-8vqzt\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.922161 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-catalog-content\") pod \"redhat-operators-8vqzt\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:04 crc kubenswrapper[4846]: I1122 09:47:04.945976 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mzl2d\" (UniqueName: \"kubernetes.io/projected/19b70b41-71c0-478d-8634-4de5d2e51e34-kube-api-access-mzl2d\") pod \"redhat-operators-8vqzt\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:05 crc kubenswrapper[4846]: I1122 09:47:05.066963 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:05 crc kubenswrapper[4846]: I1122 09:47:05.522312 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8vqzt"] Nov 22 09:47:05 crc kubenswrapper[4846]: W1122 09:47:05.529388 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b70b41_71c0_478d_8634_4de5d2e51e34.slice/crio-2a434d90660b7dd8f60ec4095c08252ddd73c5369f287df1cfb388ff0bd7b0c7 WatchSource:0}: Error finding container 2a434d90660b7dd8f60ec4095c08252ddd73c5369f287df1cfb388ff0bd7b0c7: Status 404 returned error can't find the container with id 2a434d90660b7dd8f60ec4095c08252ddd73c5369f287df1cfb388ff0bd7b0c7 Nov 22 09:47:06 crc kubenswrapper[4846]: I1122 09:47:06.190904 4846 generic.go:334] "Generic (PLEG): container finished" podID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerID="2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4" exitCode=0 Nov 22 09:47:06 crc kubenswrapper[4846]: I1122 09:47:06.190959 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vqzt" event={"ID":"19b70b41-71c0-478d-8634-4de5d2e51e34","Type":"ContainerDied","Data":"2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4"} Nov 22 09:47:06 crc kubenswrapper[4846]: I1122 09:47:06.191006 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vqzt" event={"ID":"19b70b41-71c0-478d-8634-4de5d2e51e34","Type":"ContainerStarted","Data":"2a434d90660b7dd8f60ec4095c08252ddd73c5369f287df1cfb388ff0bd7b0c7"} Nov 22 09:47:07 crc kubenswrapper[4846]: I1122 09:47:07.203371 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vqzt" event={"ID":"19b70b41-71c0-478d-8634-4de5d2e51e34","Type":"ContainerStarted","Data":"eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2"} Nov 22 09:47:07 crc kubenswrapper[4846]: E1122 09:47:07.829256 4846 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b70b41_71c0_478d_8634_4de5d2e51e34.slice/crio-eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2.scope\": RecentStats: unable to find data in memory cache]" Nov 22 09:47:08 crc kubenswrapper[4846]: I1122 09:47:08.221503 4846 generic.go:334] "Generic (PLEG): container finished" podID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerID="eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2" exitCode=0 Nov 22 09:47:08 crc kubenswrapper[4846]: I1122 09:47:08.221566 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vqzt" event={"ID":"19b70b41-71c0-478d-8634-4de5d2e51e34","Type":"ContainerDied","Data":"eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2"} Nov 22 09:47:09 crc kubenswrapper[4846]: I1122 09:47:09.233678 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vqzt" 
event={"ID":"19b70b41-71c0-478d-8634-4de5d2e51e34","Type":"ContainerStarted","Data":"c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81"} Nov 22 09:47:10 crc kubenswrapper[4846]: I1122 09:47:10.270468 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8vqzt" podStartSLOduration=3.5951378849999998 podStartE2EDuration="6.270446738s" podCreationTimestamp="2025-11-22 09:47:04 +0000 UTC" firstStartedPulling="2025-11-22 09:47:06.192711378 +0000 UTC m=+2001.128401037" lastFinishedPulling="2025-11-22 09:47:08.868020201 +0000 UTC m=+2003.803709890" observedRunningTime="2025-11-22 09:47:10.268916554 +0000 UTC m=+2005.204606213" watchObservedRunningTime="2025-11-22 09:47:10.270446738 +0000 UTC m=+2005.206136397" Nov 22 09:47:15 crc kubenswrapper[4846]: I1122 09:47:15.067514 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:15 crc kubenswrapper[4846]: I1122 09:47:15.068262 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:16 crc kubenswrapper[4846]: I1122 09:47:16.141861 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8vqzt" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerName="registry-server" probeResult="failure" output=< Nov 22 09:47:16 crc kubenswrapper[4846]: timeout: failed to connect service ":50051" within 1s Nov 22 09:47:16 crc kubenswrapper[4846]: > Nov 22 09:47:25 crc kubenswrapper[4846]: I1122 09:47:25.161415 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:25 crc kubenswrapper[4846]: I1122 09:47:25.255997 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:25 crc kubenswrapper[4846]: I1122 09:47:25.414526 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8vqzt"] Nov 22 09:47:26 crc kubenswrapper[4846]: I1122 09:47:26.447197 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8vqzt" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerName="registry-server" containerID="cri-o://c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81" gracePeriod=2 Nov 22 09:47:26 crc kubenswrapper[4846]: I1122 09:47:26.960117 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.085495 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-utilities\") pod \"19b70b41-71c0-478d-8634-4de5d2e51e34\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.085649 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzl2d\" (UniqueName: \"kubernetes.io/projected/19b70b41-71c0-478d-8634-4de5d2e51e34-kube-api-access-mzl2d\") pod \"19b70b41-71c0-478d-8634-4de5d2e51e34\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.085834 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-catalog-content\") pod \"19b70b41-71c0-478d-8634-4de5d2e51e34\" (UID: \"19b70b41-71c0-478d-8634-4de5d2e51e34\") " Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.087032 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-utilities" (OuterVolumeSpecName: "utilities") pod "19b70b41-71c0-478d-8634-4de5d2e51e34" (UID: "19b70b41-71c0-478d-8634-4de5d2e51e34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.088407 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.097631 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b70b41-71c0-478d-8634-4de5d2e51e34-kube-api-access-mzl2d" (OuterVolumeSpecName: "kube-api-access-mzl2d") pod "19b70b41-71c0-478d-8634-4de5d2e51e34" (UID: "19b70b41-71c0-478d-8634-4de5d2e51e34"). InnerVolumeSpecName "kube-api-access-mzl2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.190418 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzl2d\" (UniqueName: \"kubernetes.io/projected/19b70b41-71c0-478d-8634-4de5d2e51e34-kube-api-access-mzl2d\") on node \"crc\" DevicePath \"\"" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.201224 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19b70b41-71c0-478d-8634-4de5d2e51e34" (UID: "19b70b41-71c0-478d-8634-4de5d2e51e34"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.292574 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b70b41-71c0-478d-8634-4de5d2e51e34-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.457916 4846 generic.go:334] "Generic (PLEG): container finished" podID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerID="c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81" exitCode=0 Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.457976 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vqzt" event={"ID":"19b70b41-71c0-478d-8634-4de5d2e51e34","Type":"ContainerDied","Data":"c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81"} Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.457993 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8vqzt" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.458028 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8vqzt" event={"ID":"19b70b41-71c0-478d-8634-4de5d2e51e34","Type":"ContainerDied","Data":"2a434d90660b7dd8f60ec4095c08252ddd73c5369f287df1cfb388ff0bd7b0c7"} Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.458080 4846 scope.go:117] "RemoveContainer" containerID="c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.486334 4846 scope.go:117] "RemoveContainer" containerID="eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.510137 4846 scope.go:117] "RemoveContainer" containerID="2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.517315 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8vqzt"] Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.529606 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8vqzt"] Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.584453 4846 scope.go:117] "RemoveContainer" containerID="c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81" Nov 22 09:47:27 crc kubenswrapper[4846]: E1122 09:47:27.585181 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81\": container with ID starting with c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81 not found: ID does not exist" containerID="c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.585232 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81"} err="failed to get container status \"c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81\": rpc error: code = NotFound desc = could not find container \"c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81\": container with ID starting with c934e295a83d3aea68dd9b1aa5ed7404bb3d58874be27ae336ec216151aa7e81 not found: ID does not exist" Nov 22 09:47:27 crc 
kubenswrapper[4846]: I1122 09:47:27.585265 4846 scope.go:117] "RemoveContainer" containerID="eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2" Nov 22 09:47:27 crc kubenswrapper[4846]: E1122 09:47:27.585688 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2\": container with ID starting with eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2 not found: ID does not exist" containerID="eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.585720 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2"} err="failed to get container status \"eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2\": rpc error: code = NotFound desc = could not find container \"eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2\": container with ID starting with eec95b354911c4e19165b7de60095fa714e0493ed2608f8a1df878c6917ac5d2 not found: ID does not exist" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.585740 4846 scope.go:117] "RemoveContainer" containerID="2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4" Nov 22 09:47:27 crc kubenswrapper[4846]: E1122 09:47:27.585969 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4\": container with ID starting with 2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4 not found: ID does not exist" containerID="2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4" Nov 22 09:47:27 crc kubenswrapper[4846]: I1122 09:47:27.585993 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4"} err="failed to get container status \"2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4\": rpc error: code = NotFound desc = could not find container \"2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4\": container with ID starting with 2108cd7362ab3006d91f3902205107389f50af070020149c17c043792b4a00b4 not found: ID does not exist" Nov 22 09:47:28 crc kubenswrapper[4846]: I1122 09:47:28.045390 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" path="/var/lib/kubelet/pods/19b70b41-71c0-478d-8634-4de5d2e51e34/volumes" Nov 22 09:47:58 crc kubenswrapper[4846]: I1122 09:47:58.626118 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:47:58 crc kubenswrapper[4846]: I1122 09:47:58.626709 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:48:11 crc kubenswrapper[4846]: I1122 09:48:11.968239 4846 generic.go:334] "Generic (PLEG): 
container finished" podID="d326c85b-6234-469b-b6f4-8a4d72b62dab" containerID="c6a30afde122f7079cd4f5b2e777331fa4c9f3ee616e270baa16acf29509e6cb" exitCode=0 Nov 22 09:48:11 crc kubenswrapper[4846]: I1122 09:48:11.968372 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" event={"ID":"d326c85b-6234-469b-b6f4-8a4d72b62dab","Type":"ContainerDied","Data":"c6a30afde122f7079cd4f5b2e777331fa4c9f3ee616e270baa16acf29509e6cb"} Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.536862 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.539503 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-inventory\") pod \"d326c85b-6234-469b-b6f4-8a4d72b62dab\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.539654 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ssh-key\") pod \"d326c85b-6234-469b-b6f4-8a4d72b62dab\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.539701 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovn-combined-ca-bundle\") pod \"d326c85b-6234-469b-b6f4-8a4d72b62dab\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.539769 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovncontroller-config-0\") pod \"d326c85b-6234-469b-b6f4-8a4d72b62dab\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.539883 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99dxm\" (UniqueName: \"kubernetes.io/projected/d326c85b-6234-469b-b6f4-8a4d72b62dab-kube-api-access-99dxm\") pod \"d326c85b-6234-469b-b6f4-8a4d72b62dab\" (UID: \"d326c85b-6234-469b-b6f4-8a4d72b62dab\") " Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.546292 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d326c85b-6234-469b-b6f4-8a4d72b62dab-kube-api-access-99dxm" (OuterVolumeSpecName: "kube-api-access-99dxm") pod "d326c85b-6234-469b-b6f4-8a4d72b62dab" (UID: "d326c85b-6234-469b-b6f4-8a4d72b62dab"). InnerVolumeSpecName "kube-api-access-99dxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.547432 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d326c85b-6234-469b-b6f4-8a4d72b62dab" (UID: "d326c85b-6234-469b-b6f4-8a4d72b62dab"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.580989 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d326c85b-6234-469b-b6f4-8a4d72b62dab" (UID: "d326c85b-6234-469b-b6f4-8a4d72b62dab"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.584240 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d326c85b-6234-469b-b6f4-8a4d72b62dab" (UID: "d326c85b-6234-469b-b6f4-8a4d72b62dab"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.595260 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-inventory" (OuterVolumeSpecName: "inventory") pod "d326c85b-6234-469b-b6f4-8a4d72b62dab" (UID: "d326c85b-6234-469b-b6f4-8a4d72b62dab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.656385 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.656425 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.656435 4846 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.656448 4846 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d326c85b-6234-469b-b6f4-8a4d72b62dab-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.656460 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99dxm\" (UniqueName: \"kubernetes.io/projected/d326c85b-6234-469b-b6f4-8a4d72b62dab-kube-api-access-99dxm\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.992113 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" event={"ID":"d326c85b-6234-469b-b6f4-8a4d72b62dab","Type":"ContainerDied","Data":"eb42df03ee96796b186d0be6d21f890ffa7385e5e0be48e224d670a8d59e1118"} Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.992473 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb42df03ee96796b186d0be6d21f890ffa7385e5e0be48e224d670a8d59e1118" Nov 22 09:48:13 crc kubenswrapper[4846]: I1122 09:48:13.992147 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rfbkn" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.111888 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4"] Nov 22 09:48:14 crc kubenswrapper[4846]: E1122 09:48:14.112365 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerName="extract-utilities" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.112389 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerName="extract-utilities" Nov 22 09:48:14 crc kubenswrapper[4846]: E1122 09:48:14.112410 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d326c85b-6234-469b-b6f4-8a4d72b62dab" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.112419 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="d326c85b-6234-469b-b6f4-8a4d72b62dab" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 22 09:48:14 crc kubenswrapper[4846]: E1122 09:48:14.112447 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerName="extract-content" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.112455 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerName="extract-content" Nov 22 09:48:14 crc kubenswrapper[4846]: E1122 09:48:14.112479 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerName="registry-server" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.112487 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerName="registry-server" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.112696 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="d326c85b-6234-469b-b6f4-8a4d72b62dab" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.112760 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b70b41-71c0-478d-8634-4de5d2e51e34" containerName="registry-server" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.113495 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.119224 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.119433 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.119543 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.119705 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.119815 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.119917 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.129003 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4"] Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.165455 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.165515 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.165542 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.165599 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.165637 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6drv8\" (UniqueName: 
\"kubernetes.io/projected/34347b18-5391-4078-8165-175276d8747e-kube-api-access-6drv8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.165673 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.266984 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6drv8\" (UniqueName: \"kubernetes.io/projected/34347b18-5391-4078-8165-175276d8747e-kube-api-access-6drv8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.267070 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.267130 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.267161 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.267185 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.267425 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.271529 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.271547 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.271743 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.271747 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.275100 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.282892 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6drv8\" (UniqueName: \"kubernetes.io/projected/34347b18-5391-4078-8165-175276d8747e-kube-api-access-6drv8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.457322 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:48:14 crc kubenswrapper[4846]: I1122 09:48:14.998671 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4"] Nov 22 09:48:15 crc kubenswrapper[4846]: W1122 09:48:15.005187 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34347b18_5391_4078_8165_175276d8747e.slice/crio-9f4e946c6bb35eb4e78e0bc109808414747f800539554553b8d9173b8d2cf68a WatchSource:0}: Error finding container 9f4e946c6bb35eb4e78e0bc109808414747f800539554553b8d9173b8d2cf68a: Status 404 returned error can't find the container with id 9f4e946c6bb35eb4e78e0bc109808414747f800539554553b8d9173b8d2cf68a Nov 22 09:48:16 crc kubenswrapper[4846]: I1122 09:48:16.023702 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" event={"ID":"34347b18-5391-4078-8165-175276d8747e","Type":"ContainerStarted","Data":"2dfbdaa78530bbf80b3ef028e492fd7bbbd2f2fd935f38a888c2ad0b2d044045"} Nov 22 09:48:16 crc kubenswrapper[4846]: I1122 09:48:16.025519 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" event={"ID":"34347b18-5391-4078-8165-175276d8747e","Type":"ContainerStarted","Data":"9f4e946c6bb35eb4e78e0bc109808414747f800539554553b8d9173b8d2cf68a"} Nov 22 09:48:16 crc kubenswrapper[4846]: I1122 09:48:16.057021 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" podStartSLOduration=1.584805947 podStartE2EDuration="2.056965525s" podCreationTimestamp="2025-11-22 09:48:14 +0000 UTC" firstStartedPulling="2025-11-22 09:48:15.010031585 +0000 UTC m=+2069.945721234" lastFinishedPulling="2025-11-22 09:48:15.482191103 +0000 UTC m=+2070.417880812" observedRunningTime="2025-11-22 09:48:16.054920627 +0000 UTC m=+2070.990610316" watchObservedRunningTime="2025-11-22 09:48:16.056965525 +0000 UTC m=+2070.992655214" Nov 22 09:48:28 crc kubenswrapper[4846]: I1122 09:48:28.625407 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:48:28 crc kubenswrapper[4846]: I1122 09:48:28.625935 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:48:58 crc kubenswrapper[4846]: I1122 09:48:58.626214 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:48:58 crc kubenswrapper[4846]: I1122 09:48:58.626905 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:48:58 crc kubenswrapper[4846]: I1122 09:48:58.626991 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:48:58 crc kubenswrapper[4846]: I1122 09:48:58.628402 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fddd433fca0a7c496840e53210f130dc975d91035c6bf896d27eba8ebfc15e7"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:48:58 crc kubenswrapper[4846]: I1122 09:48:58.628544 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://6fddd433fca0a7c496840e53210f130dc975d91035c6bf896d27eba8ebfc15e7" gracePeriod=600 Nov 22 09:48:59 crc kubenswrapper[4846]: I1122 09:48:59.538458 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="6fddd433fca0a7c496840e53210f130dc975d91035c6bf896d27eba8ebfc15e7" exitCode=0 Nov 22 09:48:59 crc kubenswrapper[4846]: I1122 09:48:59.538555 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"6fddd433fca0a7c496840e53210f130dc975d91035c6bf896d27eba8ebfc15e7"} Nov 22 09:48:59 crc kubenswrapper[4846]: I1122 09:48:59.539761 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e"} Nov 22 09:48:59 crc kubenswrapper[4846]: I1122 09:48:59.539849 4846 scope.go:117] "RemoveContainer" containerID="16c957abcb558553e7986afc9a8a624b69c0f29ef51dbaec9cb9fa5f6cd5fa7b" Nov 22 09:49:17 crc kubenswrapper[4846]: I1122 09:49:17.755767 4846 generic.go:334] "Generic (PLEG): container finished" podID="34347b18-5391-4078-8165-175276d8747e" containerID="2dfbdaa78530bbf80b3ef028e492fd7bbbd2f2fd935f38a888c2ad0b2d044045" exitCode=0 Nov 22 09:49:17 crc kubenswrapper[4846]: I1122 09:49:17.755897 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" event={"ID":"34347b18-5391-4078-8165-175276d8747e","Type":"ContainerDied","Data":"2dfbdaa78530bbf80b3ef028e492fd7bbbd2f2fd935f38a888c2ad0b2d044045"} Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.275664 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.435089 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6drv8\" (UniqueName: \"kubernetes.io/projected/34347b18-5391-4078-8165-175276d8747e-kube-api-access-6drv8\") pod \"34347b18-5391-4078-8165-175276d8747e\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.435609 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-ssh-key\") pod \"34347b18-5391-4078-8165-175276d8747e\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.435650 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-nova-metadata-neutron-config-0\") pod \"34347b18-5391-4078-8165-175276d8747e\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.435697 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-inventory\") pod \"34347b18-5391-4078-8165-175276d8747e\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.435768 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"34347b18-5391-4078-8165-175276d8747e\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.435946 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-metadata-combined-ca-bundle\") pod \"34347b18-5391-4078-8165-175276d8747e\" (UID: \"34347b18-5391-4078-8165-175276d8747e\") " Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.442826 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "34347b18-5391-4078-8165-175276d8747e" (UID: "34347b18-5391-4078-8165-175276d8747e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.446369 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34347b18-5391-4078-8165-175276d8747e-kube-api-access-6drv8" (OuterVolumeSpecName: "kube-api-access-6drv8") pod "34347b18-5391-4078-8165-175276d8747e" (UID: "34347b18-5391-4078-8165-175276d8747e"). InnerVolumeSpecName "kube-api-access-6drv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.467780 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "34347b18-5391-4078-8165-175276d8747e" (UID: "34347b18-5391-4078-8165-175276d8747e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.476471 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "34347b18-5391-4078-8165-175276d8747e" (UID: "34347b18-5391-4078-8165-175276d8747e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.477410 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-inventory" (OuterVolumeSpecName: "inventory") pod "34347b18-5391-4078-8165-175276d8747e" (UID: "34347b18-5391-4078-8165-175276d8747e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.485915 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "34347b18-5391-4078-8165-175276d8747e" (UID: "34347b18-5391-4078-8165-175276d8747e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.538814 4846 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.538852 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6drv8\" (UniqueName: \"kubernetes.io/projected/34347b18-5391-4078-8165-175276d8747e-kube-api-access-6drv8\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.538897 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.538909 4846 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.538922 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.538934 4846 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/34347b18-5391-4078-8165-175276d8747e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.780603 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" event={"ID":"34347b18-5391-4078-8165-175276d8747e","Type":"ContainerDied","Data":"9f4e946c6bb35eb4e78e0bc109808414747f800539554553b8d9173b8d2cf68a"} Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.780662 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f4e946c6bb35eb4e78e0bc109808414747f800539554553b8d9173b8d2cf68a" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.780700 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.946657 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl"] Nov 22 09:49:19 crc kubenswrapper[4846]: E1122 09:49:19.947448 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34347b18-5391-4078-8165-175276d8747e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.947478 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="34347b18-5391-4078-8165-175276d8747e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.947840 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="34347b18-5391-4078-8165-175276d8747e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.948748 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.960421 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl"] Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.978971 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.979070 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.979265 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.979522 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 22 09:49:19 crc kubenswrapper[4846]: I1122 09:49:19.979614 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.048488 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.048579 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt757\" (UniqueName: \"kubernetes.io/projected/06a4ae02-37d7-458b-879a-64951da9e75a-kube-api-access-tt757\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.048611 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.048638 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.048768 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.150300 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tt757\" (UniqueName: \"kubernetes.io/projected/06a4ae02-37d7-458b-879a-64951da9e75a-kube-api-access-tt757\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.150694 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.150839 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.151032 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.151231 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.155557 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.155692 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.159665 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.162742 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.173301 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt757\" (UniqueName: \"kubernetes.io/projected/06a4ae02-37d7-458b-879a-64951da9e75a-kube-api-access-tt757\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.315547 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:49:20 crc kubenswrapper[4846]: I1122 09:49:20.871657 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl"] Nov 22 09:49:21 crc kubenswrapper[4846]: I1122 09:49:21.808437 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" event={"ID":"06a4ae02-37d7-458b-879a-64951da9e75a","Type":"ContainerStarted","Data":"c8643b7b278e09d64d92bf1713695d7dbc4d167175a0bf0703d32bf75c4fc701"} Nov 22 09:49:21 crc kubenswrapper[4846]: I1122 09:49:21.808874 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" event={"ID":"06a4ae02-37d7-458b-879a-64951da9e75a","Type":"ContainerStarted","Data":"0d91e2b688a11b7aa82eea57888224907c00a1319840d9b5a9457d792ef71926"} Nov 22 09:49:21 crc kubenswrapper[4846]: I1122 09:49:21.837653 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" podStartSLOduration=2.379693923 podStartE2EDuration="2.837626632s" podCreationTimestamp="2025-11-22 09:49:19 +0000 UTC" firstStartedPulling="2025-11-22 09:49:20.884552105 +0000 UTC m=+2135.820241794" lastFinishedPulling="2025-11-22 09:49:21.342484854 +0000 UTC m=+2136.278174503" observedRunningTime="2025-11-22 09:49:21.828327685 +0000 UTC m=+2136.764017374" watchObservedRunningTime="2025-11-22 09:49:21.837626632 +0000 UTC m=+2136.773316311" Nov 22 09:49:57 crc kubenswrapper[4846]: I1122 09:49:57.788799 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2qchd"] Nov 22 09:49:57 crc kubenswrapper[4846]: I1122 09:49:57.791261 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:57 crc kubenswrapper[4846]: I1122 09:49:57.829237 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qchd"] Nov 22 09:49:57 crc kubenswrapper[4846]: I1122 09:49:57.944291 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-catalog-content\") pod \"redhat-marketplace-2qchd\" (UID: \"33085256-f596-4a88-897e-166410540693\") " pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:57 crc kubenswrapper[4846]: I1122 09:49:57.944485 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-utilities\") pod \"redhat-marketplace-2qchd\" (UID: \"33085256-f596-4a88-897e-166410540693\") " pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:57 crc kubenswrapper[4846]: I1122 09:49:57.944534 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65d9d\" (UniqueName: \"kubernetes.io/projected/33085256-f596-4a88-897e-166410540693-kube-api-access-65d9d\") pod \"redhat-marketplace-2qchd\" (UID: \"33085256-f596-4a88-897e-166410540693\") " pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:58 crc kubenswrapper[4846]: I1122 09:49:58.045676 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-utilities\") pod \"redhat-marketplace-2qchd\" (UID: \"33085256-f596-4a88-897e-166410540693\") " pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:58 crc kubenswrapper[4846]: I1122 09:49:58.045762 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65d9d\" (UniqueName: \"kubernetes.io/projected/33085256-f596-4a88-897e-166410540693-kube-api-access-65d9d\") pod \"redhat-marketplace-2qchd\" (UID: \"33085256-f596-4a88-897e-166410540693\") " pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:58 crc kubenswrapper[4846]: I1122 09:49:58.045897 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-catalog-content\") pod \"redhat-marketplace-2qchd\" (UID: \"33085256-f596-4a88-897e-166410540693\") " pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:58 crc kubenswrapper[4846]: I1122 09:49:58.046745 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-utilities\") pod \"redhat-marketplace-2qchd\" (UID: \"33085256-f596-4a88-897e-166410540693\") " pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:58 crc kubenswrapper[4846]: I1122 09:49:58.046767 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-catalog-content\") pod \"redhat-marketplace-2qchd\" (UID: \"33085256-f596-4a88-897e-166410540693\") " pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:58 crc kubenswrapper[4846]: I1122 09:49:58.069555 4846 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-65d9d\" (UniqueName: \"kubernetes.io/projected/33085256-f596-4a88-897e-166410540693-kube-api-access-65d9d\") pod \"redhat-marketplace-2qchd\" (UID: \"33085256-f596-4a88-897e-166410540693\") " pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:58 crc kubenswrapper[4846]: I1122 09:49:58.125229 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:49:58 crc kubenswrapper[4846]: I1122 09:49:58.625281 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qchd"] Nov 22 09:49:59 crc kubenswrapper[4846]: I1122 09:49:59.230715 4846 generic.go:334] "Generic (PLEG): container finished" podID="33085256-f596-4a88-897e-166410540693" containerID="aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f" exitCode=0 Nov 22 09:49:59 crc kubenswrapper[4846]: I1122 09:49:59.230827 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qchd" event={"ID":"33085256-f596-4a88-897e-166410540693","Type":"ContainerDied","Data":"aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f"} Nov 22 09:49:59 crc kubenswrapper[4846]: I1122 09:49:59.230967 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qchd" event={"ID":"33085256-f596-4a88-897e-166410540693","Type":"ContainerStarted","Data":"d5cbdeb85f8f554bcdb9e7c35e88976829ce2129457a1beccbd21e67f1101a8a"} Nov 22 09:50:01 crc kubenswrapper[4846]: I1122 09:50:01.252820 4846 generic.go:334] "Generic (PLEG): container finished" podID="33085256-f596-4a88-897e-166410540693" containerID="6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d" exitCode=0 Nov 22 09:50:01 crc kubenswrapper[4846]: I1122 09:50:01.252956 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qchd" event={"ID":"33085256-f596-4a88-897e-166410540693","Type":"ContainerDied","Data":"6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d"} Nov 22 09:50:02 crc kubenswrapper[4846]: I1122 09:50:02.264995 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qchd" event={"ID":"33085256-f596-4a88-897e-166410540693","Type":"ContainerStarted","Data":"51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959"} Nov 22 09:50:02 crc kubenswrapper[4846]: I1122 09:50:02.300763 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2qchd" podStartSLOduration=2.757408152 podStartE2EDuration="5.300737187s" podCreationTimestamp="2025-11-22 09:49:57 +0000 UTC" firstStartedPulling="2025-11-22 09:49:59.232521951 +0000 UTC m=+2174.168211610" lastFinishedPulling="2025-11-22 09:50:01.775850996 +0000 UTC m=+2176.711540645" observedRunningTime="2025-11-22 09:50:02.289438443 +0000 UTC m=+2177.225128092" watchObservedRunningTime="2025-11-22 09:50:02.300737187 +0000 UTC m=+2177.236426836" Nov 22 09:50:08 crc kubenswrapper[4846]: I1122 09:50:08.126658 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:50:08 crc kubenswrapper[4846]: I1122 09:50:08.127227 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:50:08 crc kubenswrapper[4846]: I1122 09:50:08.189418 4846 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:50:08 crc kubenswrapper[4846]: I1122 09:50:08.404057 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:50:08 crc kubenswrapper[4846]: I1122 09:50:08.462598 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qchd"] Nov 22 09:50:10 crc kubenswrapper[4846]: I1122 09:50:10.356469 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2qchd" podUID="33085256-f596-4a88-897e-166410540693" containerName="registry-server" containerID="cri-o://51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959" gracePeriod=2 Nov 22 09:50:10 crc kubenswrapper[4846]: I1122 09:50:10.828779 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:50:10 crc kubenswrapper[4846]: I1122 09:50:10.959131 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65d9d\" (UniqueName: \"kubernetes.io/projected/33085256-f596-4a88-897e-166410540693-kube-api-access-65d9d\") pod \"33085256-f596-4a88-897e-166410540693\" (UID: \"33085256-f596-4a88-897e-166410540693\") " Nov 22 09:50:10 crc kubenswrapper[4846]: I1122 09:50:10.959365 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-catalog-content\") pod \"33085256-f596-4a88-897e-166410540693\" (UID: \"33085256-f596-4a88-897e-166410540693\") " Nov 22 09:50:10 crc kubenswrapper[4846]: I1122 09:50:10.959467 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-utilities\") pod \"33085256-f596-4a88-897e-166410540693\" (UID: \"33085256-f596-4a88-897e-166410540693\") " Nov 22 09:50:10 crc kubenswrapper[4846]: I1122 09:50:10.960930 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-utilities" (OuterVolumeSpecName: "utilities") pod "33085256-f596-4a88-897e-166410540693" (UID: "33085256-f596-4a88-897e-166410540693"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:50:10 crc kubenswrapper[4846]: I1122 09:50:10.965587 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33085256-f596-4a88-897e-166410540693-kube-api-access-65d9d" (OuterVolumeSpecName: "kube-api-access-65d9d") pod "33085256-f596-4a88-897e-166410540693" (UID: "33085256-f596-4a88-897e-166410540693"). InnerVolumeSpecName "kube-api-access-65d9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:50:10 crc kubenswrapper[4846]: I1122 09:50:10.988287 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33085256-f596-4a88-897e-166410540693" (UID: "33085256-f596-4a88-897e-166410540693"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.062139 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.062194 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33085256-f596-4a88-897e-166410540693-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.062214 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65d9d\" (UniqueName: \"kubernetes.io/projected/33085256-f596-4a88-897e-166410540693-kube-api-access-65d9d\") on node \"crc\" DevicePath \"\"" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.374897 4846 generic.go:334] "Generic (PLEG): container finished" podID="33085256-f596-4a88-897e-166410540693" containerID="51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959" exitCode=0 Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.375036 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qchd" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.375152 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qchd" event={"ID":"33085256-f596-4a88-897e-166410540693","Type":"ContainerDied","Data":"51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959"} Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.381461 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qchd" event={"ID":"33085256-f596-4a88-897e-166410540693","Type":"ContainerDied","Data":"d5cbdeb85f8f554bcdb9e7c35e88976829ce2129457a1beccbd21e67f1101a8a"} Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.381556 4846 scope.go:117] "RemoveContainer" containerID="51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.436358 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qchd"] Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.446575 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qchd"] Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.449589 4846 scope.go:117] "RemoveContainer" containerID="6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.476556 4846 scope.go:117] "RemoveContainer" containerID="aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.539956 4846 scope.go:117] "RemoveContainer" containerID="51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959" Nov 22 09:50:11 crc kubenswrapper[4846]: E1122 09:50:11.540751 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959\": container with ID starting with 51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959 not found: ID does not exist" containerID="51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.540823 4846 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959"} err="failed to get container status \"51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959\": rpc error: code = NotFound desc = could not find container \"51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959\": container with ID starting with 51b6c62bd71861390ad070c1cf2cacb882ad41aeefabc679e32e7d9862b5c959 not found: ID does not exist" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.540872 4846 scope.go:117] "RemoveContainer" containerID="6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d" Nov 22 09:50:11 crc kubenswrapper[4846]: E1122 09:50:11.541339 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d\": container with ID starting with 6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d not found: ID does not exist" containerID="6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.541378 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d"} err="failed to get container status \"6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d\": rpc error: code = NotFound desc = could not find container \"6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d\": container with ID starting with 6ce468df9d7ad19faf48696e4f07e170f646237ca62b6b7a1083d99528c20c6d not found: ID does not exist" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.541403 4846 scope.go:117] "RemoveContainer" containerID="aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f" Nov 22 09:50:11 crc kubenswrapper[4846]: E1122 09:50:11.541864 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f\": container with ID starting with aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f not found: ID does not exist" containerID="aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f" Nov 22 09:50:11 crc kubenswrapper[4846]: I1122 09:50:11.541931 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f"} err="failed to get container status \"aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f\": rpc error: code = NotFound desc = could not find container \"aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f\": container with ID starting with aa0f0bc653ca051cd16802992821fc8ebcf4826cfdbc3239330cb17d5935af6f not found: ID does not exist" Nov 22 09:50:12 crc kubenswrapper[4846]: I1122 09:50:12.049849 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33085256-f596-4a88-897e-166410540693" path="/var/lib/kubelet/pods/33085256-f596-4a88-897e-166410540693/volumes" Nov 22 09:50:58 crc kubenswrapper[4846]: I1122 09:50:58.625173 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:50:58 crc kubenswrapper[4846]: I1122 09:50:58.625756 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:51:28 crc kubenswrapper[4846]: I1122 09:51:28.625123 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:51:28 crc kubenswrapper[4846]: I1122 09:51:28.625774 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:51:58 crc kubenswrapper[4846]: I1122 09:51:58.625940 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:51:58 crc kubenswrapper[4846]: I1122 09:51:58.626904 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:51:58 crc kubenswrapper[4846]: I1122 09:51:58.626991 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 09:51:58 crc kubenswrapper[4846]: I1122 09:51:58.628763 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:51:58 crc kubenswrapper[4846]: I1122 09:51:58.628909 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" gracePeriod=600 Nov 22 09:51:58 crc kubenswrapper[4846]: E1122 09:51:58.795344 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:51:59 crc kubenswrapper[4846]: I1122 09:51:59.648890 4846 
generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" exitCode=0 Nov 22 09:51:59 crc kubenswrapper[4846]: I1122 09:51:59.648951 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e"} Nov 22 09:51:59 crc kubenswrapper[4846]: I1122 09:51:59.649038 4846 scope.go:117] "RemoveContainer" containerID="6fddd433fca0a7c496840e53210f130dc975d91035c6bf896d27eba8ebfc15e7" Nov 22 09:51:59 crc kubenswrapper[4846]: I1122 09:51:59.649624 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:51:59 crc kubenswrapper[4846]: E1122 09:51:59.649956 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:52:11 crc kubenswrapper[4846]: I1122 09:52:11.035362 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:52:11 crc kubenswrapper[4846]: E1122 09:52:11.036620 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:52:22 crc kubenswrapper[4846]: I1122 09:52:22.035563 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:52:22 crc kubenswrapper[4846]: E1122 09:52:22.036832 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:52:37 crc kubenswrapper[4846]: I1122 09:52:37.036631 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:52:37 crc kubenswrapper[4846]: E1122 09:52:37.037934 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:52:51 crc kubenswrapper[4846]: I1122 09:52:51.035980 4846 scope.go:117] "RemoveContainer" 
containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:52:51 crc kubenswrapper[4846]: E1122 09:52:51.036994 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:53:04 crc kubenswrapper[4846]: I1122 09:53:04.035182 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:53:04 crc kubenswrapper[4846]: E1122 09:53:04.036364 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:53:17 crc kubenswrapper[4846]: I1122 09:53:17.036990 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:53:17 crc kubenswrapper[4846]: E1122 09:53:17.037949 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:53:29 crc kubenswrapper[4846]: I1122 09:53:29.036219 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:53:29 crc kubenswrapper[4846]: E1122 09:53:29.037638 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:53:42 crc kubenswrapper[4846]: I1122 09:53:42.037634 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:53:42 crc kubenswrapper[4846]: E1122 09:53:42.039090 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:53:53 crc kubenswrapper[4846]: I1122 09:53:53.035341 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:53:53 crc kubenswrapper[4846]: E1122 09:53:53.036260 4846 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:54:05 crc kubenswrapper[4846]: I1122 09:54:05.023265 4846 generic.go:334] "Generic (PLEG): container finished" podID="06a4ae02-37d7-458b-879a-64951da9e75a" containerID="c8643b7b278e09d64d92bf1713695d7dbc4d167175a0bf0703d32bf75c4fc701" exitCode=0 Nov 22 09:54:05 crc kubenswrapper[4846]: I1122 09:54:05.023375 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" event={"ID":"06a4ae02-37d7-458b-879a-64951da9e75a","Type":"ContainerDied","Data":"c8643b7b278e09d64d92bf1713695d7dbc4d167175a0bf0703d32bf75c4fc701"} Nov 22 09:54:05 crc kubenswrapper[4846]: I1122 09:54:05.035795 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:54:05 crc kubenswrapper[4846]: E1122 09:54:05.036354 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.470898 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.604816 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-secret-0\") pod \"06a4ae02-37d7-458b-879a-64951da9e75a\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.604886 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt757\" (UniqueName: \"kubernetes.io/projected/06a4ae02-37d7-458b-879a-64951da9e75a-kube-api-access-tt757\") pod \"06a4ae02-37d7-458b-879a-64951da9e75a\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.604940 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-ssh-key\") pod \"06a4ae02-37d7-458b-879a-64951da9e75a\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.605015 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-inventory\") pod \"06a4ae02-37d7-458b-879a-64951da9e75a\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.605235 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-combined-ca-bundle\") pod 
\"06a4ae02-37d7-458b-879a-64951da9e75a\" (UID: \"06a4ae02-37d7-458b-879a-64951da9e75a\") " Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.610942 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "06a4ae02-37d7-458b-879a-64951da9e75a" (UID: "06a4ae02-37d7-458b-879a-64951da9e75a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.615150 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a4ae02-37d7-458b-879a-64951da9e75a-kube-api-access-tt757" (OuterVolumeSpecName: "kube-api-access-tt757") pod "06a4ae02-37d7-458b-879a-64951da9e75a" (UID: "06a4ae02-37d7-458b-879a-64951da9e75a"). InnerVolumeSpecName "kube-api-access-tt757". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.633313 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "06a4ae02-37d7-458b-879a-64951da9e75a" (UID: "06a4ae02-37d7-458b-879a-64951da9e75a"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.642622 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-inventory" (OuterVolumeSpecName: "inventory") pod "06a4ae02-37d7-458b-879a-64951da9e75a" (UID: "06a4ae02-37d7-458b-879a-64951da9e75a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.646219 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "06a4ae02-37d7-458b-879a-64951da9e75a" (UID: "06a4ae02-37d7-458b-879a-64951da9e75a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.707462 4846 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.707505 4846 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.707518 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt757\" (UniqueName: \"kubernetes.io/projected/06a4ae02-37d7-458b-879a-64951da9e75a-kube-api-access-tt757\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.707531 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:06 crc kubenswrapper[4846]: I1122 09:54:06.707547 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06a4ae02-37d7-458b-879a-64951da9e75a-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.047881 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" event={"ID":"06a4ae02-37d7-458b-879a-64951da9e75a","Type":"ContainerDied","Data":"0d91e2b688a11b7aa82eea57888224907c00a1319840d9b5a9457d792ef71926"} Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.047925 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d91e2b688a11b7aa82eea57888224907c00a1319840d9b5a9457d792ef71926" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.048016 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.156955 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g"] Nov 22 09:54:07 crc kubenswrapper[4846]: E1122 09:54:07.157861 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33085256-f596-4a88-897e-166410540693" containerName="registry-server" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.157885 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="33085256-f596-4a88-897e-166410540693" containerName="registry-server" Nov 22 09:54:07 crc kubenswrapper[4846]: E1122 09:54:07.157908 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a4ae02-37d7-458b-879a-64951da9e75a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.157919 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a4ae02-37d7-458b-879a-64951da9e75a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 22 09:54:07 crc kubenswrapper[4846]: E1122 09:54:07.157953 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33085256-f596-4a88-897e-166410540693" containerName="extract-content" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.157961 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="33085256-f596-4a88-897e-166410540693" containerName="extract-content" Nov 22 09:54:07 crc kubenswrapper[4846]: E1122 09:54:07.157992 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33085256-f596-4a88-897e-166410540693" containerName="extract-utilities" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.158001 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="33085256-f596-4a88-897e-166410540693" containerName="extract-utilities" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.158247 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a4ae02-37d7-458b-879a-64951da9e75a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.158269 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="33085256-f596-4a88-897e-166410540693" containerName="registry-server" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.159008 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.162577 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.162701 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.163084 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.163348 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.163509 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.163673 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.164078 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.169489 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g"] Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.320551 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.320624 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.320662 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkc9g\" (UniqueName: \"kubernetes.io/projected/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-kube-api-access-gkc9g\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.320688 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.320769 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-inventory\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.320809 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.320852 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.320898 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.320930 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.422704 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.422808 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.422899 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.423007 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.423125 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.423299 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.423387 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.423455 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkc9g\" (UniqueName: \"kubernetes.io/projected/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-kube-api-access-gkc9g\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.423509 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.425686 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.427111 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.430275 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.431493 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.431913 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.432520 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.434149 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.435823 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.446982 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkc9g\" (UniqueName: \"kubernetes.io/projected/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-kube-api-access-gkc9g\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nl69g\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:07 crc kubenswrapper[4846]: I1122 09:54:07.483005 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:54:08 crc kubenswrapper[4846]: I1122 09:54:08.029159 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g"] Nov 22 09:54:08 crc kubenswrapper[4846]: I1122 09:54:08.064644 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:54:08 crc kubenswrapper[4846]: I1122 09:54:08.076501 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" event={"ID":"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417","Type":"ContainerStarted","Data":"d654487484ef7b3f0829de1c3dc059bca86cdf7d7f01312327b1db57dc1b489e"} Nov 22 09:54:10 crc kubenswrapper[4846]: I1122 09:54:10.095769 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" event={"ID":"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417","Type":"ContainerStarted","Data":"6efead6114076a4848f68a829bf19fcd3f643a5a587178aa38d2e52eb534cc83"} Nov 22 09:54:10 crc kubenswrapper[4846]: I1122 09:54:10.114598 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" podStartSLOduration=1.6048024779999999 podStartE2EDuration="3.114573566s" podCreationTimestamp="2025-11-22 09:54:07 +0000 UTC" firstStartedPulling="2025-11-22 09:54:08.064312092 +0000 UTC m=+2423.000001751" lastFinishedPulling="2025-11-22 09:54:09.5740832 +0000 UTC m=+2424.509772839" observedRunningTime="2025-11-22 09:54:10.110795017 +0000 UTC m=+2425.046484676" watchObservedRunningTime="2025-11-22 09:54:10.114573566 +0000 UTC m=+2425.050263215" Nov 22 09:54:18 crc kubenswrapper[4846]: I1122 09:54:18.036667 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:54:18 crc kubenswrapper[4846]: E1122 09:54:18.038027 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:54:33 crc kubenswrapper[4846]: I1122 09:54:33.035308 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:54:33 crc kubenswrapper[4846]: E1122 09:54:33.036356 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:54:44 crc kubenswrapper[4846]: I1122 09:54:44.036367 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:54:44 crc kubenswrapper[4846]: E1122 09:54:44.037857 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:54:58 crc kubenswrapper[4846]: I1122 09:54:58.035250 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:54:58 crc kubenswrapper[4846]: E1122 09:54:58.036057 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:55:11 crc kubenswrapper[4846]: I1122 09:55:11.035415 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:55:11 crc kubenswrapper[4846]: E1122 09:55:11.036189 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:55:24 crc kubenswrapper[4846]: I1122 09:55:24.035858 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:55:24 crc kubenswrapper[4846]: E1122 09:55:24.036763 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.340562 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pn4zz"] Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.351861 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.355944 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn4zz"] Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.366987 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-catalog-content\") pod \"certified-operators-pn4zz\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.367301 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c57b\" (UniqueName: \"kubernetes.io/projected/ee43ad9f-66cc-4926-a407-130fde510903-kube-api-access-6c57b\") pod \"certified-operators-pn4zz\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.367544 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-utilities\") pod \"certified-operators-pn4zz\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.469078 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-utilities\") pod \"certified-operators-pn4zz\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.469162 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-catalog-content\") pod \"certified-operators-pn4zz\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.469243 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c57b\" (UniqueName: \"kubernetes.io/projected/ee43ad9f-66cc-4926-a407-130fde510903-kube-api-access-6c57b\") pod \"certified-operators-pn4zz\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.469742 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-utilities\") pod \"certified-operators-pn4zz\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.469806 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-catalog-content\") pod \"certified-operators-pn4zz\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.500099 4846 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6c57b\" (UniqueName: \"kubernetes.io/projected/ee43ad9f-66cc-4926-a407-130fde510903-kube-api-access-6c57b\") pod \"certified-operators-pn4zz\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.683147 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.950489 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dgxwp"] Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.952901 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.969298 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dgxwp"] Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.982017 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-utilities\") pod \"community-operators-dgxwp\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.982247 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6wn6\" (UniqueName: \"kubernetes.io/projected/9a49b9b1-eb41-4087-b860-a9eadc192718-kube-api-access-t6wn6\") pod \"community-operators-dgxwp\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:34 crc kubenswrapper[4846]: I1122 09:55:34.982282 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-catalog-content\") pod \"community-operators-dgxwp\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.083787 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6wn6\" (UniqueName: \"kubernetes.io/projected/9a49b9b1-eb41-4087-b860-a9eadc192718-kube-api-access-t6wn6\") pod \"community-operators-dgxwp\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.083835 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-catalog-content\") pod \"community-operators-dgxwp\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.084254 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-catalog-content\") pod \"community-operators-dgxwp\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.084406 4846 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-utilities\") pod \"community-operators-dgxwp\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.084917 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-utilities\") pod \"community-operators-dgxwp\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.116588 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6wn6\" (UniqueName: \"kubernetes.io/projected/9a49b9b1-eb41-4087-b860-a9eadc192718-kube-api-access-t6wn6\") pod \"community-operators-dgxwp\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.255000 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn4zz"] Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.291390 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.370466 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn4zz" event={"ID":"ee43ad9f-66cc-4926-a407-130fde510903","Type":"ContainerStarted","Data":"920a17fbc272de0c025ab588fb3fadd57d63c50bb65e250271cafd579929acfc"} Nov 22 09:55:35 crc kubenswrapper[4846]: I1122 09:55:35.787615 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dgxwp"] Nov 22 09:55:36 crc kubenswrapper[4846]: I1122 09:55:36.382754 4846 generic.go:334] "Generic (PLEG): container finished" podID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerID="04aef05ebce552e8082be4f69d1b86e359d5f3dd0b41105abed4a5b5053dc7fc" exitCode=0 Nov 22 09:55:36 crc kubenswrapper[4846]: I1122 09:55:36.382825 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgxwp" event={"ID":"9a49b9b1-eb41-4087-b860-a9eadc192718","Type":"ContainerDied","Data":"04aef05ebce552e8082be4f69d1b86e359d5f3dd0b41105abed4a5b5053dc7fc"} Nov 22 09:55:36 crc kubenswrapper[4846]: I1122 09:55:36.383312 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgxwp" event={"ID":"9a49b9b1-eb41-4087-b860-a9eadc192718","Type":"ContainerStarted","Data":"982497fe0399a690d4cfca75a2f5ff18c4863a040eb5bfdfd24011f07d8bb101"} Nov 22 09:55:36 crc kubenswrapper[4846]: I1122 09:55:36.385250 4846 generic.go:334] "Generic (PLEG): container finished" podID="ee43ad9f-66cc-4926-a407-130fde510903" containerID="55fdfeca00ca6be394c78e20ba1e1756111e1bbc010804f4c0ea52fd5755d114" exitCode=0 Nov 22 09:55:36 crc kubenswrapper[4846]: I1122 09:55:36.385287 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn4zz" event={"ID":"ee43ad9f-66cc-4926-a407-130fde510903","Type":"ContainerDied","Data":"55fdfeca00ca6be394c78e20ba1e1756111e1bbc010804f4c0ea52fd5755d114"} Nov 22 09:55:38 crc kubenswrapper[4846]: I1122 09:55:38.034847 4846 scope.go:117] "RemoveContainer" 
containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:55:38 crc kubenswrapper[4846]: E1122 09:55:38.035426 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:55:39 crc kubenswrapper[4846]: I1122 09:55:39.409917 4846 generic.go:334] "Generic (PLEG): container finished" podID="ee43ad9f-66cc-4926-a407-130fde510903" containerID="5e6ba253048773ef9ac63f701d5ceea8ca0bbd39b83d4ea973dfcb40dca2ee4c" exitCode=0 Nov 22 09:55:39 crc kubenswrapper[4846]: I1122 09:55:39.410149 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn4zz" event={"ID":"ee43ad9f-66cc-4926-a407-130fde510903","Type":"ContainerDied","Data":"5e6ba253048773ef9ac63f701d5ceea8ca0bbd39b83d4ea973dfcb40dca2ee4c"} Nov 22 09:55:39 crc kubenswrapper[4846]: I1122 09:55:39.412742 4846 generic.go:334] "Generic (PLEG): container finished" podID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerID="786fb8844703e8aa97a6b98cc1f718cd5ba875b54f7c0adac586b1867f71d1ac" exitCode=0 Nov 22 09:55:39 crc kubenswrapper[4846]: I1122 09:55:39.412778 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgxwp" event={"ID":"9a49b9b1-eb41-4087-b860-a9eadc192718","Type":"ContainerDied","Data":"786fb8844703e8aa97a6b98cc1f718cd5ba875b54f7c0adac586b1867f71d1ac"} Nov 22 09:55:40 crc kubenswrapper[4846]: I1122 09:55:40.438248 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgxwp" event={"ID":"9a49b9b1-eb41-4087-b860-a9eadc192718","Type":"ContainerStarted","Data":"dc7f97f3be98330f37f96ebda6dc33c75f17dfe49f072cdc1c3d7e99623ee415"} Nov 22 09:55:40 crc kubenswrapper[4846]: I1122 09:55:40.442484 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn4zz" event={"ID":"ee43ad9f-66cc-4926-a407-130fde510903","Type":"ContainerStarted","Data":"f67a50ceaee78adc6cd55f0e3c8955bf43ae47b038a7db000b8675e9a526d9e0"} Nov 22 09:55:40 crc kubenswrapper[4846]: I1122 09:55:40.466369 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dgxwp" podStartSLOduration=2.729298921 podStartE2EDuration="6.466348472s" podCreationTimestamp="2025-11-22 09:55:34 +0000 UTC" firstStartedPulling="2025-11-22 09:55:36.38754631 +0000 UTC m=+2511.323235969" lastFinishedPulling="2025-11-22 09:55:40.124595871 +0000 UTC m=+2515.060285520" observedRunningTime="2025-11-22 09:55:40.46216124 +0000 UTC m=+2515.397850899" watchObservedRunningTime="2025-11-22 09:55:40.466348472 +0000 UTC m=+2515.402038131" Nov 22 09:55:40 crc kubenswrapper[4846]: I1122 09:55:40.490749 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pn4zz" podStartSLOduration=2.847312732 podStartE2EDuration="6.490720836s" podCreationTimestamp="2025-11-22 09:55:34 +0000 UTC" firstStartedPulling="2025-11-22 09:55:36.387732685 +0000 UTC m=+2511.323422374" lastFinishedPulling="2025-11-22 09:55:40.031140789 +0000 UTC m=+2514.966830478" observedRunningTime="2025-11-22 09:55:40.480858541 +0000 UTC 
m=+2515.416548190" watchObservedRunningTime="2025-11-22 09:55:40.490720836 +0000 UTC m=+2515.426410495" Nov 22 09:55:44 crc kubenswrapper[4846]: I1122 09:55:44.683973 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:44 crc kubenswrapper[4846]: I1122 09:55:44.684992 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:44 crc kubenswrapper[4846]: I1122 09:55:44.738623 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:45 crc kubenswrapper[4846]: I1122 09:55:45.291962 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:45 crc kubenswrapper[4846]: I1122 09:55:45.292125 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:45 crc kubenswrapper[4846]: I1122 09:55:45.369684 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:45 crc kubenswrapper[4846]: I1122 09:55:45.535305 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:45 crc kubenswrapper[4846]: I1122 09:55:45.566330 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:46 crc kubenswrapper[4846]: I1122 09:55:46.128555 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dgxwp"] Nov 22 09:55:47 crc kubenswrapper[4846]: I1122 09:55:47.513825 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dgxwp" podUID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerName="registry-server" containerID="cri-o://dc7f97f3be98330f37f96ebda6dc33c75f17dfe49f072cdc1c3d7e99623ee415" gracePeriod=2 Nov 22 09:55:47 crc kubenswrapper[4846]: I1122 09:55:47.930537 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn4zz"] Nov 22 09:55:47 crc kubenswrapper[4846]: I1122 09:55:47.932244 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pn4zz" podUID="ee43ad9f-66cc-4926-a407-130fde510903" containerName="registry-server" containerID="cri-o://f67a50ceaee78adc6cd55f0e3c8955bf43ae47b038a7db000b8675e9a526d9e0" gracePeriod=2 Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.526970 4846 generic.go:334] "Generic (PLEG): container finished" podID="ee43ad9f-66cc-4926-a407-130fde510903" containerID="f67a50ceaee78adc6cd55f0e3c8955bf43ae47b038a7db000b8675e9a526d9e0" exitCode=0 Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.527083 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn4zz" event={"ID":"ee43ad9f-66cc-4926-a407-130fde510903","Type":"ContainerDied","Data":"f67a50ceaee78adc6cd55f0e3c8955bf43ae47b038a7db000b8675e9a526d9e0"} Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.527595 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn4zz" 
event={"ID":"ee43ad9f-66cc-4926-a407-130fde510903","Type":"ContainerDied","Data":"920a17fbc272de0c025ab588fb3fadd57d63c50bb65e250271cafd579929acfc"} Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.527623 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="920a17fbc272de0c025ab588fb3fadd57d63c50bb65e250271cafd579929acfc" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.528196 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.531675 4846 generic.go:334] "Generic (PLEG): container finished" podID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerID="dc7f97f3be98330f37f96ebda6dc33c75f17dfe49f072cdc1c3d7e99623ee415" exitCode=0 Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.531757 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgxwp" event={"ID":"9a49b9b1-eb41-4087-b860-a9eadc192718","Type":"ContainerDied","Data":"dc7f97f3be98330f37f96ebda6dc33c75f17dfe49f072cdc1c3d7e99623ee415"} Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.531803 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgxwp" event={"ID":"9a49b9b1-eb41-4087-b860-a9eadc192718","Type":"ContainerDied","Data":"982497fe0399a690d4cfca75a2f5ff18c4863a040eb5bfdfd24011f07d8bb101"} Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.531822 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="982497fe0399a690d4cfca75a2f5ff18c4863a040eb5bfdfd24011f07d8bb101" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.533382 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.663526 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6wn6\" (UniqueName: \"kubernetes.io/projected/9a49b9b1-eb41-4087-b860-a9eadc192718-kube-api-access-t6wn6\") pod \"9a49b9b1-eb41-4087-b860-a9eadc192718\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.663602 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-catalog-content\") pod \"9a49b9b1-eb41-4087-b860-a9eadc192718\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.663651 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-utilities\") pod \"ee43ad9f-66cc-4926-a407-130fde510903\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.663670 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-catalog-content\") pod \"ee43ad9f-66cc-4926-a407-130fde510903\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.663814 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c57b\" (UniqueName: \"kubernetes.io/projected/ee43ad9f-66cc-4926-a407-130fde510903-kube-api-access-6c57b\") pod \"ee43ad9f-66cc-4926-a407-130fde510903\" (UID: \"ee43ad9f-66cc-4926-a407-130fde510903\") " Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.663928 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-utilities\") pod \"9a49b9b1-eb41-4087-b860-a9eadc192718\" (UID: \"9a49b9b1-eb41-4087-b860-a9eadc192718\") " Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.664489 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-utilities" (OuterVolumeSpecName: "utilities") pod "ee43ad9f-66cc-4926-a407-130fde510903" (UID: "ee43ad9f-66cc-4926-a407-130fde510903"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.665580 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-utilities" (OuterVolumeSpecName: "utilities") pod "9a49b9b1-eb41-4087-b860-a9eadc192718" (UID: "9a49b9b1-eb41-4087-b860-a9eadc192718"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.669653 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a49b9b1-eb41-4087-b860-a9eadc192718-kube-api-access-t6wn6" (OuterVolumeSpecName: "kube-api-access-t6wn6") pod "9a49b9b1-eb41-4087-b860-a9eadc192718" (UID: "9a49b9b1-eb41-4087-b860-a9eadc192718"). InnerVolumeSpecName "kube-api-access-t6wn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.669765 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee43ad9f-66cc-4926-a407-130fde510903-kube-api-access-6c57b" (OuterVolumeSpecName: "kube-api-access-6c57b") pod "ee43ad9f-66cc-4926-a407-130fde510903" (UID: "ee43ad9f-66cc-4926-a407-130fde510903"). InnerVolumeSpecName "kube-api-access-6c57b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.726426 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a49b9b1-eb41-4087-b860-a9eadc192718" (UID: "9a49b9b1-eb41-4087-b860-a9eadc192718"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.734835 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee43ad9f-66cc-4926-a407-130fde510903" (UID: "ee43ad9f-66cc-4926-a407-130fde510903"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.766743 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c57b\" (UniqueName: \"kubernetes.io/projected/ee43ad9f-66cc-4926-a407-130fde510903-kube-api-access-6c57b\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.766794 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.766808 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6wn6\" (UniqueName: \"kubernetes.io/projected/9a49b9b1-eb41-4087-b860-a9eadc192718-kube-api-access-t6wn6\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.766817 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a49b9b1-eb41-4087-b860-a9eadc192718-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.766827 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:48 crc kubenswrapper[4846]: I1122 09:55:48.766836 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43ad9f-66cc-4926-a407-130fde510903-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:49 crc kubenswrapper[4846]: I1122 09:55:49.540382 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn4zz" Nov 22 09:55:49 crc kubenswrapper[4846]: I1122 09:55:49.540442 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dgxwp" Nov 22 09:55:49 crc kubenswrapper[4846]: I1122 09:55:49.600628 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn4zz"] Nov 22 09:55:49 crc kubenswrapper[4846]: I1122 09:55:49.612175 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pn4zz"] Nov 22 09:55:49 crc kubenswrapper[4846]: I1122 09:55:49.620976 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dgxwp"] Nov 22 09:55:49 crc kubenswrapper[4846]: I1122 09:55:49.630497 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dgxwp"] Nov 22 09:55:50 crc kubenswrapper[4846]: I1122 09:55:50.035407 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:55:50 crc kubenswrapper[4846]: E1122 09:55:50.035759 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:55:50 crc kubenswrapper[4846]: I1122 09:55:50.048726 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a49b9b1-eb41-4087-b860-a9eadc192718" path="/var/lib/kubelet/pods/9a49b9b1-eb41-4087-b860-a9eadc192718/volumes" Nov 22 09:55:50 crc kubenswrapper[4846]: I1122 09:55:50.049981 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee43ad9f-66cc-4926-a407-130fde510903" path="/var/lib/kubelet/pods/ee43ad9f-66cc-4926-a407-130fde510903/volumes" Nov 22 09:56:03 crc kubenswrapper[4846]: I1122 09:56:03.035118 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:56:03 crc kubenswrapper[4846]: E1122 09:56:03.036431 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:56:18 crc kubenswrapper[4846]: I1122 09:56:18.036340 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:56:18 crc kubenswrapper[4846]: E1122 09:56:18.039441 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:56:31 crc kubenswrapper[4846]: I1122 09:56:31.036238 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:56:31 crc kubenswrapper[4846]: E1122 09:56:31.037178 4846 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:56:44 crc kubenswrapper[4846]: I1122 09:56:44.035981 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:56:44 crc kubenswrapper[4846]: E1122 09:56:44.038136 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:56:55 crc kubenswrapper[4846]: I1122 09:56:55.035102 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:56:55 crc kubenswrapper[4846]: E1122 09:56:55.036072 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.217728 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lxgxg"] Nov 22 09:57:04 crc kubenswrapper[4846]: E1122 09:57:04.218594 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerName="registry-server" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.218607 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerName="registry-server" Nov 22 09:57:04 crc kubenswrapper[4846]: E1122 09:57:04.218632 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerName="extract-content" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.218638 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerName="extract-content" Nov 22 09:57:04 crc kubenswrapper[4846]: E1122 09:57:04.218646 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee43ad9f-66cc-4926-a407-130fde510903" containerName="extract-utilities" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.218653 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee43ad9f-66cc-4926-a407-130fde510903" containerName="extract-utilities" Nov 22 09:57:04 crc kubenswrapper[4846]: E1122 09:57:04.218667 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerName="extract-utilities" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.218674 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerName="extract-utilities" Nov 22 09:57:04 crc kubenswrapper[4846]: 
E1122 09:57:04.218691 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee43ad9f-66cc-4926-a407-130fde510903" containerName="extract-content" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.218697 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee43ad9f-66cc-4926-a407-130fde510903" containerName="extract-content" Nov 22 09:57:04 crc kubenswrapper[4846]: E1122 09:57:04.218711 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee43ad9f-66cc-4926-a407-130fde510903" containerName="registry-server" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.218718 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee43ad9f-66cc-4926-a407-130fde510903" containerName="registry-server" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.218878 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee43ad9f-66cc-4926-a407-130fde510903" containerName="registry-server" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.218893 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a49b9b1-eb41-4087-b860-a9eadc192718" containerName="registry-server" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.220176 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.231896 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxgxg"] Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.340802 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-catalog-content\") pod \"redhat-operators-lxgxg\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.341277 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-utilities\") pod \"redhat-operators-lxgxg\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.341324 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t98zb\" (UniqueName: \"kubernetes.io/projected/28dfe5cb-979c-441e-917b-8d7763919cee-kube-api-access-t98zb\") pod \"redhat-operators-lxgxg\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.443701 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-utilities\") pod \"redhat-operators-lxgxg\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.443780 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t98zb\" (UniqueName: \"kubernetes.io/projected/28dfe5cb-979c-441e-917b-8d7763919cee-kube-api-access-t98zb\") pod \"redhat-operators-lxgxg\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc 
kubenswrapper[4846]: I1122 09:57:04.443858 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-catalog-content\") pod \"redhat-operators-lxgxg\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.444426 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-catalog-content\") pod \"redhat-operators-lxgxg\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.444551 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-utilities\") pod \"redhat-operators-lxgxg\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.467126 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t98zb\" (UniqueName: \"kubernetes.io/projected/28dfe5cb-979c-441e-917b-8d7763919cee-kube-api-access-t98zb\") pod \"redhat-operators-lxgxg\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:04 crc kubenswrapper[4846]: I1122 09:57:04.539795 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:05 crc kubenswrapper[4846]: I1122 09:57:05.065411 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxgxg"] Nov 22 09:57:05 crc kubenswrapper[4846]: I1122 09:57:05.316889 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgxg" event={"ID":"28dfe5cb-979c-441e-917b-8d7763919cee","Type":"ContainerStarted","Data":"feeb37258790a48248462dcade89dfbcd449be92f886ae389e5224b45f2f81ce"} Nov 22 09:57:06 crc kubenswrapper[4846]: I1122 09:57:06.340022 4846 generic.go:334] "Generic (PLEG): container finished" podID="28dfe5cb-979c-441e-917b-8d7763919cee" containerID="02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8" exitCode=0 Nov 22 09:57:06 crc kubenswrapper[4846]: I1122 09:57:06.340206 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgxg" event={"ID":"28dfe5cb-979c-441e-917b-8d7763919cee","Type":"ContainerDied","Data":"02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8"} Nov 22 09:57:07 crc kubenswrapper[4846]: I1122 09:57:07.353931 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgxg" event={"ID":"28dfe5cb-979c-441e-917b-8d7763919cee","Type":"ContainerStarted","Data":"b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41"} Nov 22 09:57:08 crc kubenswrapper[4846]: I1122 09:57:08.365838 4846 generic.go:334] "Generic (PLEG): container finished" podID="28dfe5cb-979c-441e-917b-8d7763919cee" containerID="b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41" exitCode=0 Nov 22 09:57:08 crc kubenswrapper[4846]: I1122 09:57:08.365910 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgxg" 
event={"ID":"28dfe5cb-979c-441e-917b-8d7763919cee","Type":"ContainerDied","Data":"b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41"} Nov 22 09:57:08 crc kubenswrapper[4846]: I1122 09:57:08.371009 4846 generic.go:334] "Generic (PLEG): container finished" podID="e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" containerID="6efead6114076a4848f68a829bf19fcd3f643a5a587178aa38d2e52eb534cc83" exitCode=0 Nov 22 09:57:08 crc kubenswrapper[4846]: I1122 09:57:08.371068 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" event={"ID":"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417","Type":"ContainerDied","Data":"6efead6114076a4848f68a829bf19fcd3f643a5a587178aa38d2e52eb534cc83"} Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.781191 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.961572 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-extra-config-0\") pod \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.961776 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-0\") pod \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.961813 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-ssh-key\") pod \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.961877 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-0\") pod \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.961925 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-1\") pod \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.962001 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-1\") pod \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.962069 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-inventory\") pod \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.962196 4846 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-combined-ca-bundle\") pod \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.962230 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkc9g\" (UniqueName: \"kubernetes.io/projected/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-kube-api-access-gkc9g\") pod \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\" (UID: \"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417\") " Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.968144 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" (UID: "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.981757 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-kube-api-access-gkc9g" (OuterVolumeSpecName: "kube-api-access-gkc9g") pod "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" (UID: "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417"). InnerVolumeSpecName "kube-api-access-gkc9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.991223 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" (UID: "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.991239 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" (UID: "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:09 crc kubenswrapper[4846]: I1122 09:57:09.993841 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" (UID: "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:09.999683 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" (UID: "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.001350 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" (UID: "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.006466 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-inventory" (OuterVolumeSpecName: "inventory") pod "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" (UID: "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.010780 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" (UID: "e51d5d70-b3f1-41e3-b6c4-f3bf9b569417"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.035811 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.065199 4846 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.065240 4846 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.065254 4846 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.065266 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.065281 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkc9g\" (UniqueName: \"kubernetes.io/projected/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-kube-api-access-gkc9g\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.065293 4846 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.065305 4846 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" 
Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.065317 4846 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.065329 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51d5d70-b3f1-41e3-b6c4-f3bf9b569417-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.401831 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"ce3d9c51232ef494dd6d2c6997940a02b57399e9a6de1afb40010fd82a108cf3"} Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.403995 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" event={"ID":"e51d5d70-b3f1-41e3-b6c4-f3bf9b569417","Type":"ContainerDied","Data":"d654487484ef7b3f0829de1c3dc059bca86cdf7d7f01312327b1db57dc1b489e"} Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.404090 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d654487484ef7b3f0829de1c3dc059bca86cdf7d7f01312327b1db57dc1b489e" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.404341 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nl69g" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.407017 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgxg" event={"ID":"28dfe5cb-979c-441e-917b-8d7763919cee","Type":"ContainerStarted","Data":"03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4"} Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.535777 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lxgxg" podStartSLOduration=3.6428872119999998 podStartE2EDuration="6.53575546s" podCreationTimestamp="2025-11-22 09:57:04 +0000 UTC" firstStartedPulling="2025-11-22 09:57:06.343093989 +0000 UTC m=+2601.278783648" lastFinishedPulling="2025-11-22 09:57:09.235962247 +0000 UTC m=+2604.171651896" observedRunningTime="2025-11-22 09:57:10.484861739 +0000 UTC m=+2605.420551398" watchObservedRunningTime="2025-11-22 09:57:10.53575546 +0000 UTC m=+2605.471445109" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.545860 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm"] Nov 22 09:57:10 crc kubenswrapper[4846]: E1122 09:57:10.567576 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.567620 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.567815 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51d5d70-b3f1-41e3-b6c4-f3bf9b569417" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.568449 4846 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm"] Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.568537 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.570712 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.570891 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.574563 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6pprd" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.575878 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.576358 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.590066 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.590129 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.590278 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.590304 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.590386 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh5m9\" (UniqueName: \"kubernetes.io/projected/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-kube-api-access-bh5m9\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: 
I1122 09:57:10.590430 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.590483 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.692344 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.692423 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.692514 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.692541 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.692596 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh5m9\" (UniqueName: \"kubernetes.io/projected/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-kube-api-access-bh5m9\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.692641 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.692684 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.705075 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.705153 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.705609 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.707733 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.710575 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh5m9\" (UniqueName: \"kubernetes.io/projected/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-kube-api-access-bh5m9\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.710572 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.712591 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") 
" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:10 crc kubenswrapper[4846]: I1122 09:57:10.892101 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:57:11 crc kubenswrapper[4846]: W1122 09:57:11.474539 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fa86a5b_2dbc_4e12_bf49_ea58d02854b0.slice/crio-446e78d1d39d880bc5d6a9af240df8ae7442dfc8ef206749652263ffb4fba6eb WatchSource:0}: Error finding container 446e78d1d39d880bc5d6a9af240df8ae7442dfc8ef206749652263ffb4fba6eb: Status 404 returned error can't find the container with id 446e78d1d39d880bc5d6a9af240df8ae7442dfc8ef206749652263ffb4fba6eb Nov 22 09:57:11 crc kubenswrapper[4846]: I1122 09:57:11.475027 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm"] Nov 22 09:57:12 crc kubenswrapper[4846]: I1122 09:57:12.426061 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" event={"ID":"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0","Type":"ContainerStarted","Data":"ea31847df74b0aba282e707255d5ff90fd02f202d5fe2a7ad60f455a1eee4441"} Nov 22 09:57:12 crc kubenswrapper[4846]: I1122 09:57:12.426512 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" event={"ID":"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0","Type":"ContainerStarted","Data":"446e78d1d39d880bc5d6a9af240df8ae7442dfc8ef206749652263ffb4fba6eb"} Nov 22 09:57:12 crc kubenswrapper[4846]: I1122 09:57:12.452904 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" podStartSLOduration=2.042618612 podStartE2EDuration="2.452879633s" podCreationTimestamp="2025-11-22 09:57:10 +0000 UTC" firstStartedPulling="2025-11-22 09:57:11.478988481 +0000 UTC m=+2606.414678130" lastFinishedPulling="2025-11-22 09:57:11.889249492 +0000 UTC m=+2606.824939151" observedRunningTime="2025-11-22 09:57:12.442094703 +0000 UTC m=+2607.377784362" watchObservedRunningTime="2025-11-22 09:57:12.452879633 +0000 UTC m=+2607.388569282" Nov 22 09:57:14 crc kubenswrapper[4846]: I1122 09:57:14.541687 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:14 crc kubenswrapper[4846]: I1122 09:57:14.542165 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:15 crc kubenswrapper[4846]: I1122 09:57:15.623952 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lxgxg" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" containerName="registry-server" probeResult="failure" output=< Nov 22 09:57:15 crc kubenswrapper[4846]: timeout: failed to connect service ":50051" within 1s Nov 22 09:57:15 crc kubenswrapper[4846]: > Nov 22 09:57:24 crc kubenswrapper[4846]: I1122 09:57:24.628088 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:24 crc kubenswrapper[4846]: I1122 09:57:24.701970 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:24 crc kubenswrapper[4846]: I1122 
09:57:24.886274 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxgxg"] Nov 22 09:57:26 crc kubenswrapper[4846]: I1122 09:57:26.582863 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lxgxg" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" containerName="registry-server" containerID="cri-o://03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4" gracePeriod=2 Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.048264 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.168764 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t98zb\" (UniqueName: \"kubernetes.io/projected/28dfe5cb-979c-441e-917b-8d7763919cee-kube-api-access-t98zb\") pod \"28dfe5cb-979c-441e-917b-8d7763919cee\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.168862 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-utilities\") pod \"28dfe5cb-979c-441e-917b-8d7763919cee\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.169778 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-utilities" (OuterVolumeSpecName: "utilities") pod "28dfe5cb-979c-441e-917b-8d7763919cee" (UID: "28dfe5cb-979c-441e-917b-8d7763919cee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.169860 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-catalog-content\") pod \"28dfe5cb-979c-441e-917b-8d7763919cee\" (UID: \"28dfe5cb-979c-441e-917b-8d7763919cee\") " Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.172426 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.174265 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28dfe5cb-979c-441e-917b-8d7763919cee-kube-api-access-t98zb" (OuterVolumeSpecName: "kube-api-access-t98zb") pod "28dfe5cb-979c-441e-917b-8d7763919cee" (UID: "28dfe5cb-979c-441e-917b-8d7763919cee"). InnerVolumeSpecName "kube-api-access-t98zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.264476 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28dfe5cb-979c-441e-917b-8d7763919cee" (UID: "28dfe5cb-979c-441e-917b-8d7763919cee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.274102 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28dfe5cb-979c-441e-917b-8d7763919cee-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.274144 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t98zb\" (UniqueName: \"kubernetes.io/projected/28dfe5cb-979c-441e-917b-8d7763919cee-kube-api-access-t98zb\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.598401 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgxg" event={"ID":"28dfe5cb-979c-441e-917b-8d7763919cee","Type":"ContainerDied","Data":"03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4"} Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.598460 4846 scope.go:117] "RemoveContainer" containerID="03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.598400 4846 generic.go:334] "Generic (PLEG): container finished" podID="28dfe5cb-979c-441e-917b-8d7763919cee" containerID="03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4" exitCode=0 Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.598511 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxgxg" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.598541 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgxg" event={"ID":"28dfe5cb-979c-441e-917b-8d7763919cee","Type":"ContainerDied","Data":"feeb37258790a48248462dcade89dfbcd449be92f886ae389e5224b45f2f81ce"} Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.620435 4846 scope.go:117] "RemoveContainer" containerID="b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.650822 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxgxg"] Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.660438 4846 scope.go:117] "RemoveContainer" containerID="02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.668735 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lxgxg"] Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.708393 4846 scope.go:117] "RemoveContainer" containerID="03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4" Nov 22 09:57:27 crc kubenswrapper[4846]: E1122 09:57:27.708868 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4\": container with ID starting with 03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4 not found: ID does not exist" containerID="03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.708910 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4"} err="failed to get container status \"03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4\": 
rpc error: code = NotFound desc = could not find container \"03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4\": container with ID starting with 03bd1e6cd67830212bce8f09185771d83aed2af5b4f58df5e11eabb25b04aab4 not found: ID does not exist" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.708938 4846 scope.go:117] "RemoveContainer" containerID="b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41" Nov 22 09:57:27 crc kubenswrapper[4846]: E1122 09:57:27.709258 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41\": container with ID starting with b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41 not found: ID does not exist" containerID="b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.709303 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41"} err="failed to get container status \"b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41\": rpc error: code = NotFound desc = could not find container \"b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41\": container with ID starting with b9d1694012526c3d22ef1ce6683eaecfc307405f6216fefd7fe5536534481f41 not found: ID does not exist" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.709329 4846 scope.go:117] "RemoveContainer" containerID="02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8" Nov 22 09:57:27 crc kubenswrapper[4846]: E1122 09:57:27.709586 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8\": container with ID starting with 02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8 not found: ID does not exist" containerID="02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8" Nov 22 09:57:27 crc kubenswrapper[4846]: I1122 09:57:27.709615 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8"} err="failed to get container status \"02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8\": rpc error: code = NotFound desc = could not find container \"02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8\": container with ID starting with 02d78cc28511f859b39c50552b105798917c0173a7d8a64a9f9aecd32fcb2ea8 not found: ID does not exist" Nov 22 09:57:28 crc kubenswrapper[4846]: I1122 09:57:28.055132 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" path="/var/lib/kubelet/pods/28dfe5cb-979c-441e-917b-8d7763919cee/volumes" Nov 22 09:59:28 crc kubenswrapper[4846]: I1122 09:59:28.627359 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:59:28 crc kubenswrapper[4846]: I1122 09:59:28.627829 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" 
podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:59:40 crc kubenswrapper[4846]: I1122 09:59:40.075738 4846 generic.go:334] "Generic (PLEG): container finished" podID="7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" containerID="ea31847df74b0aba282e707255d5ff90fd02f202d5fe2a7ad60f455a1eee4441" exitCode=0 Nov 22 09:59:40 crc kubenswrapper[4846]: I1122 09:59:40.076365 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" event={"ID":"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0","Type":"ContainerDied","Data":"ea31847df74b0aba282e707255d5ff90fd02f202d5fe2a7ad60f455a1eee4441"} Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.510134 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.581078 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-0\") pod \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.581158 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-inventory\") pod \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.581249 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-1\") pod \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.581348 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ssh-key\") pod \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.581392 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-telemetry-combined-ca-bundle\") pod \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.581434 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh5m9\" (UniqueName: \"kubernetes.io/projected/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-kube-api-access-bh5m9\") pod \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\" (UID: \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.581551 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-2\") pod \"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\" (UID: 
\"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0\") " Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.587309 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-kube-api-access-bh5m9" (OuterVolumeSpecName: "kube-api-access-bh5m9") pod "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" (UID: "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0"). InnerVolumeSpecName "kube-api-access-bh5m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.588803 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" (UID: "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.612225 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-inventory" (OuterVolumeSpecName: "inventory") pod "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" (UID: "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.618523 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" (UID: "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.624164 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" (UID: "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.636306 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" (UID: "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.640288 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" (UID: "7fa86a5b-2dbc-4e12-bf49-ea58d02854b0"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.683776 4846 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.683812 4846 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.683826 4846 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.683839 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.683856 4846 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.683868 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh5m9\" (UniqueName: \"kubernetes.io/projected/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-kube-api-access-bh5m9\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:41 crc kubenswrapper[4846]: I1122 09:59:41.683879 4846 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7fa86a5b-2dbc-4e12-bf49-ea58d02854b0-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:42 crc kubenswrapper[4846]: I1122 09:59:42.101976 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" event={"ID":"7fa86a5b-2dbc-4e12-bf49-ea58d02854b0","Type":"ContainerDied","Data":"446e78d1d39d880bc5d6a9af240df8ae7442dfc8ef206749652263ffb4fba6eb"} Nov 22 09:59:42 crc kubenswrapper[4846]: I1122 09:59:42.102039 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm" Nov 22 09:59:42 crc kubenswrapper[4846]: I1122 09:59:42.102078 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="446e78d1d39d880bc5d6a9af240df8ae7442dfc8ef206749652263ffb4fba6eb" Nov 22 09:59:58 crc kubenswrapper[4846]: I1122 09:59:58.625676 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:59:58 crc kubenswrapper[4846]: I1122 09:59:58.626416 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.151219 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km"] Nov 22 10:00:00 crc kubenswrapper[4846]: E1122 10:00:00.151868 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.151881 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 22 10:00:00 crc kubenswrapper[4846]: E1122 10:00:00.151896 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" containerName="extract-content" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.151902 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" containerName="extract-content" Nov 22 10:00:00 crc kubenswrapper[4846]: E1122 10:00:00.151912 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" containerName="extract-utilities" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.151918 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" containerName="extract-utilities" Nov 22 10:00:00 crc kubenswrapper[4846]: E1122 10:00:00.151925 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" containerName="registry-server" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.151930 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" containerName="registry-server" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.152136 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="28dfe5cb-979c-441e-917b-8d7763919cee" containerName="registry-server" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.152153 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa86a5b-2dbc-4e12-bf49-ea58d02854b0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.152786 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.156581 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.156832 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.186091 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km"] Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.208517 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a806eebb-f551-49f4-bfa9-fdabaaeacb14-secret-volume\") pod \"collect-profiles-29396760-9q4km\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.208946 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmnp\" (UniqueName: \"kubernetes.io/projected/a806eebb-f551-49f4-bfa9-fdabaaeacb14-kube-api-access-dnmnp\") pod \"collect-profiles-29396760-9q4km\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.209280 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a806eebb-f551-49f4-bfa9-fdabaaeacb14-config-volume\") pod \"collect-profiles-29396760-9q4km\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.311503 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a806eebb-f551-49f4-bfa9-fdabaaeacb14-secret-volume\") pod \"collect-profiles-29396760-9q4km\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.311618 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmnp\" (UniqueName: \"kubernetes.io/projected/a806eebb-f551-49f4-bfa9-fdabaaeacb14-kube-api-access-dnmnp\") pod \"collect-profiles-29396760-9q4km\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.311690 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a806eebb-f551-49f4-bfa9-fdabaaeacb14-config-volume\") pod \"collect-profiles-29396760-9q4km\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.312749 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a806eebb-f551-49f4-bfa9-fdabaaeacb14-config-volume\") pod 
\"collect-profiles-29396760-9q4km\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.326976 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a806eebb-f551-49f4-bfa9-fdabaaeacb14-secret-volume\") pod \"collect-profiles-29396760-9q4km\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.330246 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmnp\" (UniqueName: \"kubernetes.io/projected/a806eebb-f551-49f4-bfa9-fdabaaeacb14-kube-api-access-dnmnp\") pod \"collect-profiles-29396760-9q4km\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.480362 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:00 crc kubenswrapper[4846]: I1122 10:00:00.939634 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km"] Nov 22 10:00:01 crc kubenswrapper[4846]: I1122 10:00:01.295212 4846 generic.go:334] "Generic (PLEG): container finished" podID="a806eebb-f551-49f4-bfa9-fdabaaeacb14" containerID="835aec9c7278d9604263fa7d47eb406e140f648de5960451dfe5eef0ed579c06" exitCode=0 Nov 22 10:00:01 crc kubenswrapper[4846]: I1122 10:00:01.295284 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" event={"ID":"a806eebb-f551-49f4-bfa9-fdabaaeacb14","Type":"ContainerDied","Data":"835aec9c7278d9604263fa7d47eb406e140f648de5960451dfe5eef0ed579c06"} Nov 22 10:00:01 crc kubenswrapper[4846]: I1122 10:00:01.295463 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" event={"ID":"a806eebb-f551-49f4-bfa9-fdabaaeacb14","Type":"ContainerStarted","Data":"90d573be4a9ba8a06e9a3658f092af03ab6dc632f4591a00336194c925b32f7e"} Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.630420 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.752787 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a806eebb-f551-49f4-bfa9-fdabaaeacb14-secret-volume\") pod \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.752981 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnmnp\" (UniqueName: \"kubernetes.io/projected/a806eebb-f551-49f4-bfa9-fdabaaeacb14-kube-api-access-dnmnp\") pod \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.753025 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a806eebb-f551-49f4-bfa9-fdabaaeacb14-config-volume\") pod \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\" (UID: \"a806eebb-f551-49f4-bfa9-fdabaaeacb14\") " Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.753694 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a806eebb-f551-49f4-bfa9-fdabaaeacb14-config-volume" (OuterVolumeSpecName: "config-volume") pod "a806eebb-f551-49f4-bfa9-fdabaaeacb14" (UID: "a806eebb-f551-49f4-bfa9-fdabaaeacb14"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.758267 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a806eebb-f551-49f4-bfa9-fdabaaeacb14-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a806eebb-f551-49f4-bfa9-fdabaaeacb14" (UID: "a806eebb-f551-49f4-bfa9-fdabaaeacb14"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.758278 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a806eebb-f551-49f4-bfa9-fdabaaeacb14-kube-api-access-dnmnp" (OuterVolumeSpecName: "kube-api-access-dnmnp") pod "a806eebb-f551-49f4-bfa9-fdabaaeacb14" (UID: "a806eebb-f551-49f4-bfa9-fdabaaeacb14"). InnerVolumeSpecName "kube-api-access-dnmnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.855635 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnmnp\" (UniqueName: \"kubernetes.io/projected/a806eebb-f551-49f4-bfa9-fdabaaeacb14-kube-api-access-dnmnp\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.855682 4846 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a806eebb-f551-49f4-bfa9-fdabaaeacb14-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:02 crc kubenswrapper[4846]: I1122 10:00:02.855699 4846 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a806eebb-f551-49f4-bfa9-fdabaaeacb14-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4846]: I1122 10:00:03.321525 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" event={"ID":"a806eebb-f551-49f4-bfa9-fdabaaeacb14","Type":"ContainerDied","Data":"90d573be4a9ba8a06e9a3658f092af03ab6dc632f4591a00336194c925b32f7e"} Nov 22 10:00:03 crc kubenswrapper[4846]: I1122 10:00:03.321842 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90d573be4a9ba8a06e9a3658f092af03ab6dc632f4591a00336194c925b32f7e" Nov 22 10:00:03 crc kubenswrapper[4846]: I1122 10:00:03.321645 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-9q4km" Nov 22 10:00:03 crc kubenswrapper[4846]: I1122 10:00:03.717848 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4"] Nov 22 10:00:03 crc kubenswrapper[4846]: I1122 10:00:03.734022 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-nmwg4"] Nov 22 10:00:04 crc kubenswrapper[4846]: I1122 10:00:04.068576 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a54ed2d-f7cd-440a-86c3-4c82ce070ac0" path="/var/lib/kubelet/pods/8a54ed2d-f7cd-440a-86c3-4c82ce070ac0/volumes" Nov 22 10:00:15 crc kubenswrapper[4846]: I1122 10:00:15.887949 4846 scope.go:117] "RemoveContainer" containerID="898b8837d050f320f57d4e5faef53c05c2bd8a6a9e7ea86754c7389f5be690a4" Nov 22 10:00:28 crc kubenswrapper[4846]: I1122 10:00:28.626148 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:00:28 crc kubenswrapper[4846]: I1122 10:00:28.626768 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:00:28 crc kubenswrapper[4846]: I1122 10:00:28.626831 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 10:00:28 crc kubenswrapper[4846]: I1122 10:00:28.627661 4846 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce3d9c51232ef494dd6d2c6997940a02b57399e9a6de1afb40010fd82a108cf3"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:00:28 crc kubenswrapper[4846]: I1122 10:00:28.627762 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://ce3d9c51232ef494dd6d2c6997940a02b57399e9a6de1afb40010fd82a108cf3" gracePeriod=600 Nov 22 10:00:29 crc kubenswrapper[4846]: I1122 10:00:29.635203 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="ce3d9c51232ef494dd6d2c6997940a02b57399e9a6de1afb40010fd82a108cf3" exitCode=0 Nov 22 10:00:29 crc kubenswrapper[4846]: I1122 10:00:29.635259 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"ce3d9c51232ef494dd6d2c6997940a02b57399e9a6de1afb40010fd82a108cf3"} Nov 22 10:00:29 crc kubenswrapper[4846]: I1122 10:00:29.638239 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"} Nov 22 10:00:29 crc kubenswrapper[4846]: I1122 10:00:29.638272 4846 scope.go:117] "RemoveContainer" containerID="9128697111e7b558023d0001649125562555740a4e57a0309cb8901032f9df1e" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.687649 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b8psd"] Nov 22 10:00:37 crc kubenswrapper[4846]: E1122 10:00:37.688497 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a806eebb-f551-49f4-bfa9-fdabaaeacb14" containerName="collect-profiles" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.688510 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="a806eebb-f551-49f4-bfa9-fdabaaeacb14" containerName="collect-profiles" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.688685 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="a806eebb-f551-49f4-bfa9-fdabaaeacb14" containerName="collect-profiles" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.690213 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.715414 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8psd"] Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.811836 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-catalog-content\") pod \"redhat-marketplace-b8psd\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.811875 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-utilities\") pod \"redhat-marketplace-b8psd\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.811923 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwfkc\" (UniqueName: \"kubernetes.io/projected/c347872f-8e38-4877-b640-084ed00adf4c-kube-api-access-rwfkc\") pod \"redhat-marketplace-b8psd\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.914260 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-catalog-content\") pod \"redhat-marketplace-b8psd\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.914319 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-utilities\") pod \"redhat-marketplace-b8psd\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.914367 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwfkc\" (UniqueName: \"kubernetes.io/projected/c347872f-8e38-4877-b640-084ed00adf4c-kube-api-access-rwfkc\") pod \"redhat-marketplace-b8psd\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.915011 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-utilities\") pod \"redhat-marketplace-b8psd\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.915188 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-catalog-content\") pod \"redhat-marketplace-b8psd\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:37 crc kubenswrapper[4846]: I1122 10:00:37.934226 4846 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rwfkc\" (UniqueName: \"kubernetes.io/projected/c347872f-8e38-4877-b640-084ed00adf4c-kube-api-access-rwfkc\") pod \"redhat-marketplace-b8psd\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:38 crc kubenswrapper[4846]: I1122 10:00:38.011729 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:38 crc kubenswrapper[4846]: I1122 10:00:38.491779 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8psd"] Nov 22 10:00:38 crc kubenswrapper[4846]: I1122 10:00:38.736357 4846 generic.go:334] "Generic (PLEG): container finished" podID="c347872f-8e38-4877-b640-084ed00adf4c" containerID="8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9" exitCode=0 Nov 22 10:00:38 crc kubenswrapper[4846]: I1122 10:00:38.736405 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8psd" event={"ID":"c347872f-8e38-4877-b640-084ed00adf4c","Type":"ContainerDied","Data":"8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9"} Nov 22 10:00:38 crc kubenswrapper[4846]: I1122 10:00:38.736454 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8psd" event={"ID":"c347872f-8e38-4877-b640-084ed00adf4c","Type":"ContainerStarted","Data":"ec98a8b9f7ea54a93ec35182320b2635d3a9d2a82a599d2b6b97161c43155eb3"} Nov 22 10:00:38 crc kubenswrapper[4846]: I1122 10:00:38.738016 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:00:39 crc kubenswrapper[4846]: I1122 10:00:39.747106 4846 generic.go:334] "Generic (PLEG): container finished" podID="c347872f-8e38-4877-b640-084ed00adf4c" containerID="1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279" exitCode=0 Nov 22 10:00:39 crc kubenswrapper[4846]: I1122 10:00:39.747216 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8psd" event={"ID":"c347872f-8e38-4877-b640-084ed00adf4c","Type":"ContainerDied","Data":"1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279"} Nov 22 10:00:40 crc kubenswrapper[4846]: I1122 10:00:40.760911 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8psd" event={"ID":"c347872f-8e38-4877-b640-084ed00adf4c","Type":"ContainerStarted","Data":"8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc"} Nov 22 10:00:40 crc kubenswrapper[4846]: I1122 10:00:40.782917 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b8psd" podStartSLOduration=2.35693728 podStartE2EDuration="3.782899486s" podCreationTimestamp="2025-11-22 10:00:37 +0000 UTC" firstStartedPulling="2025-11-22 10:00:38.737818021 +0000 UTC m=+2813.673507670" lastFinishedPulling="2025-11-22 10:00:40.163780187 +0000 UTC m=+2815.099469876" observedRunningTime="2025-11-22 10:00:40.780679332 +0000 UTC m=+2815.716368981" watchObservedRunningTime="2025-11-22 10:00:40.782899486 +0000 UTC m=+2815.718589135" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.622969 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.625861 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.629776 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ttc5q" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.629835 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.629774 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.632084 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.639293 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.724798 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.724873 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.724916 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29xdz\" (UniqueName: \"kubernetes.io/projected/0746377b-0ff5-4289-b4b6-1e9c3a166533-kube-api-access-29xdz\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.724936 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.724983 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.725127 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.725258 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.725358 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-config-data\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.725374 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827310 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827369 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827405 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827442 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-config-data\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827460 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827533 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827562 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ca-certs\") pod \"tempest-tests-tempest\" 
(UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827588 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29xdz\" (UniqueName: \"kubernetes.io/projected/0746377b-0ff5-4289-b4b6-1e9c3a166533-kube-api-access-29xdz\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827605 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.827758 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.828428 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.829015 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.829152 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.829286 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-config-data\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.837525 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.837938 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc 
kubenswrapper[4846]: I1122 10:00:43.844326 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.850466 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29xdz\" (UniqueName: \"kubernetes.io/projected/0746377b-0ff5-4289-b4b6-1e9c3a166533-kube-api-access-29xdz\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.865268 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " pod="openstack/tempest-tests-tempest" Nov 22 10:00:43 crc kubenswrapper[4846]: I1122 10:00:43.966860 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 10:00:44 crc kubenswrapper[4846]: I1122 10:00:44.494356 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 22 10:00:44 crc kubenswrapper[4846]: I1122 10:00:44.802995 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0746377b-0ff5-4289-b4b6-1e9c3a166533","Type":"ContainerStarted","Data":"38c1e84ddbd03357e2f916e54cd964bd4f0ec7f2813509552f29cfc461931e1e"} Nov 22 10:00:48 crc kubenswrapper[4846]: I1122 10:00:48.012840 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:48 crc kubenswrapper[4846]: I1122 10:00:48.013363 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:48 crc kubenswrapper[4846]: I1122 10:00:48.095082 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:48 crc kubenswrapper[4846]: I1122 10:00:48.901777 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:48 crc kubenswrapper[4846]: I1122 10:00:48.955984 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8psd"] Nov 22 10:00:50 crc kubenswrapper[4846]: I1122 10:00:50.869037 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b8psd" podUID="c347872f-8e38-4877-b640-084ed00adf4c" containerName="registry-server" containerID="cri-o://8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc" gracePeriod=2 Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.344954 4846 util.go:48] "No ready sandbox for pod can be found. 
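The "Killing container with a grace period" entry above (gracePeriod=2 for registry-server; gracePeriod=600 for machine-config-daemon earlier) is the window between the polite stop request and a forced kill. A sketch of that shape against an ordinary child process rather than a CRI container; the sleep command and the 2-second grace value are illustrative, and this is Unix-only:

    // Sketch of graceful termination: SIGTERM, wait up to the grace
    // period, then SIGKILL. Only illustrates the shape of the kubelet/
    // runtime behaviour; it does not talk to a container runtime.
    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()

        _ = cmd.Process.Signal(syscall.SIGTERM) // polite request
        select {
        case <-done:
            fmt.Println("exited within grace period")
        case <-time.After(grace):
            _ = cmd.Process.Kill() // SIGKILL once the grace period expires
            <-done
            fmt.Println("force-killed after grace period")
        }
    }

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            panic(err)
        }
        killWithGrace(cmd, 2*time.Second)
    }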
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.400353 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-utilities\") pod \"c347872f-8e38-4877-b640-084ed00adf4c\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.400414 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-catalog-content\") pod \"c347872f-8e38-4877-b640-084ed00adf4c\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.400514 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwfkc\" (UniqueName: \"kubernetes.io/projected/c347872f-8e38-4877-b640-084ed00adf4c-kube-api-access-rwfkc\") pod \"c347872f-8e38-4877-b640-084ed00adf4c\" (UID: \"c347872f-8e38-4877-b640-084ed00adf4c\") " Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.401612 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-utilities" (OuterVolumeSpecName: "utilities") pod "c347872f-8e38-4877-b640-084ed00adf4c" (UID: "c347872f-8e38-4877-b640-084ed00adf4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.408711 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c347872f-8e38-4877-b640-084ed00adf4c-kube-api-access-rwfkc" (OuterVolumeSpecName: "kube-api-access-rwfkc") pod "c347872f-8e38-4877-b640-084ed00adf4c" (UID: "c347872f-8e38-4877-b640-084ed00adf4c"). InnerVolumeSpecName "kube-api-access-rwfkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.427931 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c347872f-8e38-4877-b640-084ed00adf4c" (UID: "c347872f-8e38-4877-b640-084ed00adf4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.503012 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.503080 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c347872f-8e38-4877-b640-084ed00adf4c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.503093 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwfkc\" (UniqueName: \"kubernetes.io/projected/c347872f-8e38-4877-b640-084ed00adf4c-kube-api-access-rwfkc\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.884337 4846 generic.go:334] "Generic (PLEG): container finished" podID="c347872f-8e38-4877-b640-084ed00adf4c" containerID="8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc" exitCode=0 Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.884693 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8psd" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.884971 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8psd" event={"ID":"c347872f-8e38-4877-b640-084ed00adf4c","Type":"ContainerDied","Data":"8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc"} Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.885003 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8psd" event={"ID":"c347872f-8e38-4877-b640-084ed00adf4c","Type":"ContainerDied","Data":"ec98a8b9f7ea54a93ec35182320b2635d3a9d2a82a599d2b6b97161c43155eb3"} Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.885018 4846 scope.go:117] "RemoveContainer" containerID="8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.921548 4846 scope.go:117] "RemoveContainer" containerID="1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.926375 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8psd"] Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.935367 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8psd"] Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.955913 4846 scope.go:117] "RemoveContainer" containerID="8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.994970 4846 scope.go:117] "RemoveContainer" containerID="8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc" Nov 22 10:00:51 crc kubenswrapper[4846]: E1122 10:00:51.995621 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc\": container with ID starting with 8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc not found: ID does not exist" containerID="8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.995676 4846 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc"} err="failed to get container status \"8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc\": rpc error: code = NotFound desc = could not find container \"8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc\": container with ID starting with 8356c494a67af531031a48043d15901a44f48bf9b8189a86770a9ab1222839dc not found: ID does not exist" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.995711 4846 scope.go:117] "RemoveContainer" containerID="1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279" Nov 22 10:00:51 crc kubenswrapper[4846]: E1122 10:00:51.996321 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279\": container with ID starting with 1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279 not found: ID does not exist" containerID="1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.996373 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279"} err="failed to get container status \"1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279\": rpc error: code = NotFound desc = could not find container \"1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279\": container with ID starting with 1bdb5eef6ea8bbcf3b1b62842b4d7681ef04756655e1e4647f2f7f1ef9657279 not found: ID does not exist" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.996402 4846 scope.go:117] "RemoveContainer" containerID="8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9" Nov 22 10:00:51 crc kubenswrapper[4846]: E1122 10:00:51.996845 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9\": container with ID starting with 8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9 not found: ID does not exist" containerID="8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9" Nov 22 10:00:51 crc kubenswrapper[4846]: I1122 10:00:51.996879 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9"} err="failed to get container status \"8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9\": rpc error: code = NotFound desc = could not find container \"8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9\": container with ID starting with 8dcb0b81b9dfc2386ba7785ec0009ce7dc12dfd4fd3a6951815df4831b2ea2c9 not found: ID does not exist" Nov 22 10:00:52 crc kubenswrapper[4846]: I1122 10:00:52.046824 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c347872f-8e38-4877-b640-084ed00adf4c" path="/var/lib/kubelet/pods/c347872f-8e38-4877-b640-084ed00adf4c/volumes" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.144417 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29396761-j28rw"] Nov 22 10:01:00 crc kubenswrapper[4846]: E1122 10:01:00.146156 4846 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c347872f-8e38-4877-b640-084ed00adf4c" containerName="extract-content" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.146184 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c347872f-8e38-4877-b640-084ed00adf4c" containerName="extract-content" Nov 22 10:01:00 crc kubenswrapper[4846]: E1122 10:01:00.146221 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c347872f-8e38-4877-b640-084ed00adf4c" containerName="registry-server" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.146232 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c347872f-8e38-4877-b640-084ed00adf4c" containerName="registry-server" Nov 22 10:01:00 crc kubenswrapper[4846]: E1122 10:01:00.146256 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c347872f-8e38-4877-b640-084ed00adf4c" containerName="extract-utilities" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.146269 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="c347872f-8e38-4877-b640-084ed00adf4c" containerName="extract-utilities" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.146523 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="c347872f-8e38-4877-b640-084ed00adf4c" containerName="registry-server" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.147608 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.156120 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396761-j28rw"] Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.289184 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfxc\" (UniqueName: \"kubernetes.io/projected/06e59565-2673-4e50-a150-a4f336c8dbfe-kube-api-access-kcfxc\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.289564 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-config-data\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.289714 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-combined-ca-bundle\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.289821 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-fernet-keys\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.391483 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfxc\" (UniqueName: \"kubernetes.io/projected/06e59565-2673-4e50-a150-a4f336c8dbfe-kube-api-access-kcfxc\") pod 
\"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.391578 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-config-data\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.391626 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-combined-ca-bundle\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.391648 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-fernet-keys\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.398237 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-fernet-keys\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.398449 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-config-data\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.401298 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-combined-ca-bundle\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.408367 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfxc\" (UniqueName: \"kubernetes.io/projected/06e59565-2673-4e50-a150-a4f336c8dbfe-kube-api-access-kcfxc\") pod \"keystone-cron-29396761-j28rw\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:00 crc kubenswrapper[4846]: I1122 10:01:00.487894 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:26 crc kubenswrapper[4846]: E1122 10:01:26.348983 4846 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Nov 22 10:01:26 crc kubenswrapper[4846]: E1122 10:01:26.350128 4846 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-29xdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(0746377b-0ff5-4289-b4b6-1e9c3a166533): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 10:01:26 crc kubenswrapper[4846]: E1122 10:01:26.352312 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="0746377b-0ff5-4289-b4b6-1e9c3a166533" Nov 22 10:01:26 crc kubenswrapper[4846]: W1122 10:01:26.781586 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e59565_2673_4e50_a150_a4f336c8dbfe.slice/crio-7be9e89da764628ca4217d895ce334fd0f098dc9f0b79a0784f31ba450a9d39f WatchSource:0}: Error finding container 7be9e89da764628ca4217d895ce334fd0f098dc9f0b79a0784f31ba450a9d39f: Status 404 returned error can't find the container with id 7be9e89da764628ca4217d895ce334fd0f098dc9f0b79a0784f31ba450a9d39f Nov 22 10:01:26 crc kubenswrapper[4846]: I1122 10:01:26.783516 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396761-j28rw"] Nov 22 10:01:27 crc kubenswrapper[4846]: I1122 10:01:27.283533 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396761-j28rw" event={"ID":"06e59565-2673-4e50-a150-a4f336c8dbfe","Type":"ContainerStarted","Data":"75f8810560308fba8c51bae401b3a6893d4a79d515718c9b22bbc7ff0dfad9ba"} Nov 22 10:01:27 crc kubenswrapper[4846]: I1122 10:01:27.283970 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396761-j28rw" event={"ID":"06e59565-2673-4e50-a150-a4f336c8dbfe","Type":"ContainerStarted","Data":"7be9e89da764628ca4217d895ce334fd0f098dc9f0b79a0784f31ba450a9d39f"} Nov 22 10:01:27 crc kubenswrapper[4846]: E1122 10:01:27.285998 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="0746377b-0ff5-4289-b4b6-1e9c3a166533" Nov 22 10:01:27 crc kubenswrapper[4846]: I1122 10:01:27.331953 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29396761-j28rw" podStartSLOduration=27.33193222 podStartE2EDuration="27.33193222s" podCreationTimestamp="2025-11-22 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:01:27.329727047 +0000 UTC m=+2862.265416706" watchObservedRunningTime="2025-11-22 10:01:27.33193222 +0000 UTC m=+2862.267621869" Nov 22 10:01:29 crc kubenswrapper[4846]: I1122 10:01:29.305563 4846 generic.go:334] "Generic (PLEG): container finished" podID="06e59565-2673-4e50-a150-a4f336c8dbfe" containerID="75f8810560308fba8c51bae401b3a6893d4a79d515718c9b22bbc7ff0dfad9ba" exitCode=0 Nov 22 10:01:29 crc kubenswrapper[4846]: I1122 10:01:29.305669 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396761-j28rw" event={"ID":"06e59565-2673-4e50-a150-a4f336c8dbfe","Type":"ContainerDied","Data":"75f8810560308fba8c51bae401b3a6893d4a79d515718c9b22bbc7ff0dfad9ba"} Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.704771 4846 util.go:48] "No ready sandbox for pod can be found. 
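The ErrImagePull ("context canceled") followed by ImagePullBackOff above is the kubelet's retry machinery at work: a failed pull is retried with exponential back-off rather than immediately, and the pull for tempest-tests-tempest does eventually succeed at 10:01:41 below. The 10-second initial delay doubling to a 5-minute cap matches commonly documented kubelet defaults, but those values are assumptions here, not read from this node's configuration.

    // Sketch of capped exponential back-off for image-pull retries.
    // Delays are the assumed kubelet defaults, for illustration only.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second
        const maxDelay = 5 * time.Minute
        for attempt := 1; attempt <= 6; attempt++ {
            fmt.Printf("attempt %d: ImagePullBackOff, retrying in %v\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }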
Need to start a new one" pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.852603 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-config-data\") pod \"06e59565-2673-4e50-a150-a4f336c8dbfe\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.852662 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcfxc\" (UniqueName: \"kubernetes.io/projected/06e59565-2673-4e50-a150-a4f336c8dbfe-kube-api-access-kcfxc\") pod \"06e59565-2673-4e50-a150-a4f336c8dbfe\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.852702 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-combined-ca-bundle\") pod \"06e59565-2673-4e50-a150-a4f336c8dbfe\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.852847 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-fernet-keys\") pod \"06e59565-2673-4e50-a150-a4f336c8dbfe\" (UID: \"06e59565-2673-4e50-a150-a4f336c8dbfe\") " Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.860166 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e59565-2673-4e50-a150-a4f336c8dbfe-kube-api-access-kcfxc" (OuterVolumeSpecName: "kube-api-access-kcfxc") pod "06e59565-2673-4e50-a150-a4f336c8dbfe" (UID: "06e59565-2673-4e50-a150-a4f336c8dbfe"). InnerVolumeSpecName "kube-api-access-kcfxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.867244 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "06e59565-2673-4e50-a150-a4f336c8dbfe" (UID: "06e59565-2673-4e50-a150-a4f336c8dbfe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.891151 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06e59565-2673-4e50-a150-a4f336c8dbfe" (UID: "06e59565-2673-4e50-a150-a4f336c8dbfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.917646 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-config-data" (OuterVolumeSpecName: "config-data") pod "06e59565-2673-4e50-a150-a4f336c8dbfe" (UID: "06e59565-2673-4e50-a150-a4f336c8dbfe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.955284 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.955331 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcfxc\" (UniqueName: \"kubernetes.io/projected/06e59565-2673-4e50-a150-a4f336c8dbfe-kube-api-access-kcfxc\") on node \"crc\" DevicePath \"\"" Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.955345 4846 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:01:30 crc kubenswrapper[4846]: I1122 10:01:30.955358 4846 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06e59565-2673-4e50-a150-a4f336c8dbfe-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 10:01:31 crc kubenswrapper[4846]: I1122 10:01:31.358090 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396761-j28rw" event={"ID":"06e59565-2673-4e50-a150-a4f336c8dbfe","Type":"ContainerDied","Data":"7be9e89da764628ca4217d895ce334fd0f098dc9f0b79a0784f31ba450a9d39f"} Nov 22 10:01:31 crc kubenswrapper[4846]: I1122 10:01:31.358134 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7be9e89da764628ca4217d895ce334fd0f098dc9f0b79a0784f31ba450a9d39f" Nov 22 10:01:31 crc kubenswrapper[4846]: I1122 10:01:31.358132 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396761-j28rw" Nov 22 10:01:41 crc kubenswrapper[4846]: I1122 10:01:41.548248 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 22 10:01:43 crc kubenswrapper[4846]: I1122 10:01:43.503410 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0746377b-0ff5-4289-b4b6-1e9c3a166533","Type":"ContainerStarted","Data":"1b00de3d4c7bcaba4ef69cb16114114bd776cba7e9962ad5d60019e66ca5eb52"} Nov 22 10:01:43 crc kubenswrapper[4846]: I1122 10:01:43.540633 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.506881882 podStartE2EDuration="1m1.540603907s" podCreationTimestamp="2025-11-22 10:00:42 +0000 UTC" firstStartedPulling="2025-11-22 10:00:44.510857686 +0000 UTC m=+2819.446547335" lastFinishedPulling="2025-11-22 10:01:41.544579701 +0000 UTC m=+2876.480269360" observedRunningTime="2025-11-22 10:01:43.532421913 +0000 UTC m=+2878.468111602" watchObservedRunningTime="2025-11-22 10:01:43.540603907 +0000 UTC m=+2878.476293586" Nov 22 10:02:26 crc kubenswrapper[4846]: I1122 10:02:26.316967 4846 scope.go:117] "RemoveContainer" containerID="55fdfeca00ca6be394c78e20ba1e1756111e1bbc010804f4c0ea52fd5755d114" Nov 22 10:02:26 crc kubenswrapper[4846]: I1122 10:02:26.342802 4846 scope.go:117] "RemoveContainer" containerID="5e6ba253048773ef9ac63f701d5ceea8ca0bbd39b83d4ea973dfcb40dca2ee4c" Nov 22 10:02:26 crc kubenswrapper[4846]: I1122 10:02:26.421909 4846 scope.go:117] "RemoveContainer" containerID="dc7f97f3be98330f37f96ebda6dc33c75f17dfe49f072cdc1c3d7e99623ee415" Nov 22 10:02:26 crc kubenswrapper[4846]: I1122 
Nov 22 10:02:26 crc kubenswrapper[4846]: I1122 10:02:26.498605 4846 scope.go:117] "RemoveContainer" containerID="f67a50ceaee78adc6cd55f0e3c8955bf43ae47b038a7db000b8675e9a526d9e0"
Nov 22 10:02:26 crc kubenswrapper[4846]: I1122 10:02:26.531442 4846 scope.go:117] "RemoveContainer" containerID="786fb8844703e8aa97a6b98cc1f718cd5ba875b54f7c0adac586b1867f71d1ac"
Nov 22 10:02:28 crc kubenswrapper[4846]: I1122 10:02:28.625262 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:02:28 crc kubenswrapper[4846]: I1122 10:02:28.625352 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:02:58 crc kubenswrapper[4846]: I1122 10:02:58.625522 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:02:58 crc kubenswrapper[4846]: I1122 10:02:58.626115 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:03:28 crc kubenswrapper[4846]: I1122 10:03:28.625882 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:03:28 crc kubenswrapper[4846]: I1122 10:03:28.628516 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:03:28 crc kubenswrapper[4846]: I1122 10:03:28.628742 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw"
Nov 22 10:03:28 crc kubenswrapper[4846]: I1122 10:03:28.630078 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 10:03:28 crc kubenswrapper[4846]: I1122 10:03:28.630349 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" gracePeriod=600
Nov 22 10:03:28 crc kubenswrapper[4846]: E1122 10:03:28.995471 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:03:29 crc kubenswrapper[4846]: I1122 10:03:29.620973 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" exitCode=0
Nov 22 10:03:29 crc kubenswrapper[4846]: I1122 10:03:29.621074 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"}
Nov 22 10:03:29 crc kubenswrapper[4846]: I1122 10:03:29.621591 4846 scope.go:117] "RemoveContainer" containerID="ce3d9c51232ef494dd6d2c6997940a02b57399e9a6de1afb40010fd82a108cf3"
Nov 22 10:03:29 crc kubenswrapper[4846]: I1122 10:03:29.622317 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:03:29 crc kubenswrapper[4846]: E1122 10:03:29.622816 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:03:44 crc kubenswrapper[4846]: I1122 10:03:44.035090 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:03:44 crc kubenswrapper[4846]: E1122 10:03:44.035994 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:03:58 crc kubenswrapper[4846]: I1122 10:03:58.036176 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:03:58 crc kubenswrapper[4846]: E1122 10:03:58.037119 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:04:13 crc kubenswrapper[4846]: I1122 10:04:13.035896 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:04:13 crc kubenswrapper[4846]: E1122 10:04:13.036705 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:04:25 crc kubenswrapper[4846]: I1122 10:04:25.036625 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:04:25 crc kubenswrapper[4846]: E1122 10:04:25.037560 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:04:37 crc kubenswrapper[4846]: I1122 10:04:37.035709 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:04:37 crc kubenswrapper[4846]: E1122 10:04:37.037584 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:04:50 crc kubenswrapper[4846]: I1122 10:04:50.035890 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:04:50 crc kubenswrapper[4846]: E1122 10:04:50.037162 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:05:01 crc kubenswrapper[4846]: I1122 10:05:01.036406 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:05:01 crc kubenswrapper[4846]: E1122 10:05:01.039283 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:05:16 crc kubenswrapper[4846]: I1122 10:05:16.046024 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:05:16 crc kubenswrapper[4846]: E1122 10:05:16.047086 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:05:27 crc kubenswrapper[4846]: I1122 10:05:27.036369 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:05:27 crc kubenswrapper[4846]: E1122 10:05:27.037438 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:05:39 crc kubenswrapper[4846]: I1122 10:05:39.035697 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:05:39 crc kubenswrapper[4846]: E1122 10:05:39.039034 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:05:51 crc kubenswrapper[4846]: I1122 10:05:51.035456 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:05:51 crc kubenswrapper[4846]: E1122 10:05:51.036361 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.055199 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rjrs5"]
Nov 22 10:06:05 crc kubenswrapper[4846]: E1122 10:06:05.056528 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e59565-2673-4e50-a150-a4f336c8dbfe" containerName="keystone-cron"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.056548 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e59565-2673-4e50-a150-a4f336c8dbfe" containerName="keystone-cron"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.056782 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e59565-2673-4e50-a150-a4f336c8dbfe" containerName="keystone-cron"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.058745 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.079238 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjrs5"]
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.167611 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpdj\" (UniqueName: \"kubernetes.io/projected/8889408f-2372-4402-b151-27a43c75c4c5-kube-api-access-dfpdj\") pod \"certified-operators-rjrs5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.167759 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-catalog-content\") pod \"certified-operators-rjrs5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.168469 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-utilities\") pod \"certified-operators-rjrs5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.270499 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-utilities\") pod \"certified-operators-rjrs5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.270667 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpdj\" (UniqueName: \"kubernetes.io/projected/8889408f-2372-4402-b151-27a43c75c4c5-kube-api-access-dfpdj\") pod \"certified-operators-rjrs5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.270705 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-catalog-content\") pod \"certified-operators-rjrs5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.271089 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-utilities\") pod \"certified-operators-rjrs5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.271439 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-catalog-content\") pod \"certified-operators-rjrs5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.305147 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpdj\" (UniqueName: \"kubernetes.io/projected/8889408f-2372-4402-b151-27a43c75c4c5-kube-api-access-dfpdj\") pod \"certified-operators-rjrs5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.394758 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjrs5"
Nov 22 10:06:05 crc kubenswrapper[4846]: I1122 10:06:05.946318 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjrs5"]
Nov 22 10:06:06 crc kubenswrapper[4846]: I1122 10:06:06.037418 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:06:06 crc kubenswrapper[4846]: E1122 10:06:06.037949 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:06:06 crc kubenswrapper[4846]: I1122 10:06:06.362285 4846 generic.go:334] "Generic (PLEG): container finished" podID="8889408f-2372-4402-b151-27a43c75c4c5" containerID="d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94" exitCode=0
Nov 22 10:06:06 crc kubenswrapper[4846]: I1122 10:06:06.362398 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjrs5" event={"ID":"8889408f-2372-4402-b151-27a43c75c4c5","Type":"ContainerDied","Data":"d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94"}
Nov 22 10:06:06 crc kubenswrapper[4846]: I1122 10:06:06.362642 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjrs5" event={"ID":"8889408f-2372-4402-b151-27a43c75c4c5","Type":"ContainerStarted","Data":"5d77f0bd1a1d1cf7bb996af8844bcdc08f2d367486a11b8c9530268fd53a1b63"}
Nov 22 10:06:06 crc kubenswrapper[4846]: I1122 10:06:06.364958 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 10:06:07 crc kubenswrapper[4846]: E1122 10:06:07.880013 4846 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8889408f_2372_4402_b151_27a43c75c4c5.slice/crio-conmon-c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c.scope\": RecentStats: unable to find data in memory cache]"
Nov 22 10:06:08 crc kubenswrapper[4846]: I1122 10:06:08.389275 4846 generic.go:334] "Generic (PLEG): container finished" podID="8889408f-2372-4402-b151-27a43c75c4c5" containerID="c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c" exitCode=0
Nov 22 10:06:08 crc kubenswrapper[4846]: I1122 10:06:08.389355 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjrs5" event={"ID":"8889408f-2372-4402-b151-27a43c75c4c5","Type":"ContainerDied","Data":"c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c"}
Nov 22 10:06:09 crc kubenswrapper[4846]: I1122 10:06:09.413173 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjrs5" event={"ID":"8889408f-2372-4402-b151-27a43c75c4c5","Type":"ContainerStarted","Data":"1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1"}
pod="openshift-marketplace/certified-operators-rjrs5" event={"ID":"8889408f-2372-4402-b151-27a43c75c4c5","Type":"ContainerStarted","Data":"1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1"} Nov 22 10:06:15 crc kubenswrapper[4846]: I1122 10:06:15.396074 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rjrs5" Nov 22 10:06:15 crc kubenswrapper[4846]: I1122 10:06:15.396596 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rjrs5" Nov 22 10:06:15 crc kubenswrapper[4846]: I1122 10:06:15.444393 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rjrs5" Nov 22 10:06:15 crc kubenswrapper[4846]: I1122 10:06:15.465924 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rjrs5" podStartSLOduration=8.023224153 podStartE2EDuration="10.465901539s" podCreationTimestamp="2025-11-22 10:06:05 +0000 UTC" firstStartedPulling="2025-11-22 10:06:06.364707745 +0000 UTC m=+3141.300397394" lastFinishedPulling="2025-11-22 10:06:08.807385091 +0000 UTC m=+3143.743074780" observedRunningTime="2025-11-22 10:06:09.435846781 +0000 UTC m=+3144.371536430" watchObservedRunningTime="2025-11-22 10:06:15.465901539 +0000 UTC m=+3150.401591188" Nov 22 10:06:15 crc kubenswrapper[4846]: I1122 10:06:15.521920 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rjrs5" Nov 22 10:06:15 crc kubenswrapper[4846]: I1122 10:06:15.679632 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjrs5"] Nov 22 10:06:17 crc kubenswrapper[4846]: I1122 10:06:17.490194 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rjrs5" podUID="8889408f-2372-4402-b151-27a43c75c4c5" containerName="registry-server" containerID="cri-o://1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1" gracePeriod=2 Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.013639 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rjrs5" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.062828 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-catalog-content\") pod \"8889408f-2372-4402-b151-27a43c75c4c5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.063132 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-utilities\") pod \"8889408f-2372-4402-b151-27a43c75c4c5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.063224 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfpdj\" (UniqueName: \"kubernetes.io/projected/8889408f-2372-4402-b151-27a43c75c4c5-kube-api-access-dfpdj\") pod \"8889408f-2372-4402-b151-27a43c75c4c5\" (UID: \"8889408f-2372-4402-b151-27a43c75c4c5\") " Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.064668 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-utilities" (OuterVolumeSpecName: "utilities") pod "8889408f-2372-4402-b151-27a43c75c4c5" (UID: "8889408f-2372-4402-b151-27a43c75c4c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.071511 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8889408f-2372-4402-b151-27a43c75c4c5-kube-api-access-dfpdj" (OuterVolumeSpecName: "kube-api-access-dfpdj") pod "8889408f-2372-4402-b151-27a43c75c4c5" (UID: "8889408f-2372-4402-b151-27a43c75c4c5"). InnerVolumeSpecName "kube-api-access-dfpdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.103613 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j6zgv"] Nov 22 10:06:18 crc kubenswrapper[4846]: E1122 10:06:18.104445 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8889408f-2372-4402-b151-27a43c75c4c5" containerName="registry-server" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.104486 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8889408f-2372-4402-b151-27a43c75c4c5" containerName="registry-server" Nov 22 10:06:18 crc kubenswrapper[4846]: E1122 10:06:18.104501 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8889408f-2372-4402-b151-27a43c75c4c5" containerName="extract-content" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.104508 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8889408f-2372-4402-b151-27a43c75c4c5" containerName="extract-content" Nov 22 10:06:18 crc kubenswrapper[4846]: E1122 10:06:18.104526 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8889408f-2372-4402-b151-27a43c75c4c5" containerName="extract-utilities" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.104532 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="8889408f-2372-4402-b151-27a43c75c4c5" containerName="extract-utilities" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.104908 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="8889408f-2372-4402-b151-27a43c75c4c5" containerName="registry-server" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.106395 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.111469 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6zgv"] Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.165430 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-utilities\") pod \"community-operators-j6zgv\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") " pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.165487 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-catalog-content\") pod \"community-operators-j6zgv\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") " pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.165592 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4nx8\" (UniqueName: \"kubernetes.io/projected/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-kube-api-access-m4nx8\") pod \"community-operators-j6zgv\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") " pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.165662 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 
10:06:18.165676 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfpdj\" (UniqueName: \"kubernetes.io/projected/8889408f-2372-4402-b151-27a43c75c4c5-kube-api-access-dfpdj\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.191313 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8889408f-2372-4402-b151-27a43c75c4c5" (UID: "8889408f-2372-4402-b151-27a43c75c4c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.266793 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-utilities\") pod \"community-operators-j6zgv\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") " pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.266845 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-catalog-content\") pod \"community-operators-j6zgv\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") " pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.266937 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4nx8\" (UniqueName: \"kubernetes.io/projected/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-kube-api-access-m4nx8\") pod \"community-operators-j6zgv\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") " pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.267010 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8889408f-2372-4402-b151-27a43c75c4c5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.267472 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-utilities\") pod \"community-operators-j6zgv\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") " pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.267497 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-catalog-content\") pod \"community-operators-j6zgv\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") " pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.307023 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4nx8\" (UniqueName: \"kubernetes.io/projected/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-kube-api-access-m4nx8\") pod \"community-operators-j6zgv\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") " pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.447682 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.510327 4846 generic.go:334] "Generic (PLEG): container finished" podID="8889408f-2372-4402-b151-27a43c75c4c5" containerID="1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1" exitCode=0 Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.510390 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjrs5" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.510383 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjrs5" event={"ID":"8889408f-2372-4402-b151-27a43c75c4c5","Type":"ContainerDied","Data":"1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1"} Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.510930 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjrs5" event={"ID":"8889408f-2372-4402-b151-27a43c75c4c5","Type":"ContainerDied","Data":"5d77f0bd1a1d1cf7bb996af8844bcdc08f2d367486a11b8c9530268fd53a1b63"} Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.510958 4846 scope.go:117] "RemoveContainer" containerID="1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.564349 4846 scope.go:117] "RemoveContainer" containerID="c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.572450 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjrs5"] Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.581616 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rjrs5"] Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.616332 4846 scope.go:117] "RemoveContainer" containerID="d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.681467 4846 scope.go:117] "RemoveContainer" containerID="1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1" Nov 22 10:06:18 crc kubenswrapper[4846]: E1122 10:06:18.687426 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1\": container with ID starting with 1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1 not found: ID does not exist" containerID="1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.687459 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1"} err="failed to get container status \"1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1\": rpc error: code = NotFound desc = could not find container \"1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1\": container with ID starting with 1e369d9cacd5878c5e284c7203433a738912159e0d649f9253f6d7fb4e08caf1 not found: ID does not exist" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.687481 4846 scope.go:117] "RemoveContainer" containerID="c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c" Nov 22 10:06:18 crc kubenswrapper[4846]: E1122 10:06:18.693421 4846 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c\": container with ID starting with c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c not found: ID does not exist" containerID="c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.693452 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c"} err="failed to get container status \"c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c\": rpc error: code = NotFound desc = could not find container \"c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c\": container with ID starting with c46303bc5c2c99393a1bdff2a9d60f6e9b55ae53c40b73c8a2fb3c118708a74c not found: ID does not exist" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.693470 4846 scope.go:117] "RemoveContainer" containerID="d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94" Nov 22 10:06:18 crc kubenswrapper[4846]: E1122 10:06:18.693703 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94\": container with ID starting with d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94 not found: ID does not exist" containerID="d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94" Nov 22 10:06:18 crc kubenswrapper[4846]: I1122 10:06:18.693726 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94"} err="failed to get container status \"d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94\": rpc error: code = NotFound desc = could not find container \"d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94\": container with ID starting with d22324db091f27cf9a73520bb2675daa2d22a195a57fde77975e21d244d0dd94 not found: ID does not exist" Nov 22 10:06:19 crc kubenswrapper[4846]: I1122 10:06:19.027925 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6zgv"] Nov 22 10:06:19 crc kubenswrapper[4846]: I1122 10:06:19.521137 4846 generic.go:334] "Generic (PLEG): container finished" podID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerID="9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03" exitCode=0 Nov 22 10:06:19 crc kubenswrapper[4846]: I1122 10:06:19.521282 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6zgv" event={"ID":"0e6fb47e-c4c0-4933-97c3-adaa5b694dae","Type":"ContainerDied","Data":"9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03"} Nov 22 10:06:19 crc kubenswrapper[4846]: I1122 10:06:19.521399 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6zgv" event={"ID":"0e6fb47e-c4c0-4933-97c3-adaa5b694dae","Type":"ContainerStarted","Data":"cac5b8b83be3b28d99de2640461cdcb272756ce59d8faf38ba42ecf616dec683"} Nov 22 10:06:20 crc kubenswrapper[4846]: I1122 10:06:20.052622 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8889408f-2372-4402-b151-27a43c75c4c5" path="/var/lib/kubelet/pods/8889408f-2372-4402-b151-27a43c75c4c5/volumes" Nov 22 
Nov 22 10:06:20 crc kubenswrapper[4846]: I1122 10:06:20.533794 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6zgv" event={"ID":"0e6fb47e-c4c0-4933-97c3-adaa5b694dae","Type":"ContainerStarted","Data":"e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a"}
Nov 22 10:06:21 crc kubenswrapper[4846]: I1122 10:06:21.035141 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e"
Nov 22 10:06:21 crc kubenswrapper[4846]: E1122 10:06:21.035413 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:06:21 crc kubenswrapper[4846]: I1122 10:06:21.547281 4846 generic.go:334] "Generic (PLEG): container finished" podID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerID="e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a" exitCode=0
Nov 22 10:06:21 crc kubenswrapper[4846]: I1122 10:06:21.547359 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6zgv" event={"ID":"0e6fb47e-c4c0-4933-97c3-adaa5b694dae","Type":"ContainerDied","Data":"e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a"}
Nov 22 10:06:22 crc kubenswrapper[4846]: I1122 10:06:22.569996 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6zgv" event={"ID":"0e6fb47e-c4c0-4933-97c3-adaa5b694dae","Type":"ContainerStarted","Data":"76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8"}
Nov 22 10:06:22 crc kubenswrapper[4846]: I1122 10:06:22.590011 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j6zgv" podStartSLOduration=2.102459665 podStartE2EDuration="4.58999075s" podCreationTimestamp="2025-11-22 10:06:18 +0000 UTC" firstStartedPulling="2025-11-22 10:06:19.523797246 +0000 UTC m=+3154.459486895" lastFinishedPulling="2025-11-22 10:06:22.011328331 +0000 UTC m=+3156.947017980" observedRunningTime="2025-11-22 10:06:22.588287031 +0000 UTC m=+3157.523976690" watchObservedRunningTime="2025-11-22 10:06:22.58999075 +0000 UTC m=+3157.525680409"
Nov 22 10:06:28 crc kubenswrapper[4846]: I1122 10:06:28.449632 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j6zgv"
Nov 22 10:06:28 crc kubenswrapper[4846]: I1122 10:06:28.451721 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j6zgv"
Nov 22 10:06:28 crc kubenswrapper[4846]: I1122 10:06:28.501712 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j6zgv"
Nov 22 10:06:28 crc kubenswrapper[4846]: I1122 10:06:28.676591 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j6zgv"
Nov 22 10:06:28 crc kubenswrapper[4846]: I1122 10:06:28.732106 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6zgv"]
Nov 22 10:06:30 crc kubenswrapper[4846]: I1122 10:06:30.641935 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j6zgv" podUID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerName="registry-server" containerID="cri-o://76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8" gracePeriod=2
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.156428 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6zgv"
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.228853 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4nx8\" (UniqueName: \"kubernetes.io/projected/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-kube-api-access-m4nx8\") pod \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") "
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.228996 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-catalog-content\") pod \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") "
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.230134 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-utilities\") pod \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\" (UID: \"0e6fb47e-c4c0-4933-97c3-adaa5b694dae\") "
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.231257 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-utilities" (OuterVolumeSpecName: "utilities") pod "0e6fb47e-c4c0-4933-97c3-adaa5b694dae" (UID: "0e6fb47e-c4c0-4933-97c3-adaa5b694dae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.236038 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-kube-api-access-m4nx8" (OuterVolumeSpecName: "kube-api-access-m4nx8") pod "0e6fb47e-c4c0-4933-97c3-adaa5b694dae" (UID: "0e6fb47e-c4c0-4933-97c3-adaa5b694dae"). InnerVolumeSpecName "kube-api-access-m4nx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.284273 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e6fb47e-c4c0-4933-97c3-adaa5b694dae" (UID: "0e6fb47e-c4c0-4933-97c3-adaa5b694dae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.332775 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.332807 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4nx8\" (UniqueName: \"kubernetes.io/projected/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-kube-api-access-m4nx8\") on node \"crc\" DevicePath \"\""
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.332820 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e6fb47e-c4c0-4933-97c3-adaa5b694dae-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.658696 4846 generic.go:334] "Generic (PLEG): container finished" podID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerID="76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8" exitCode=0
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.658759 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6zgv" event={"ID":"0e6fb47e-c4c0-4933-97c3-adaa5b694dae","Type":"ContainerDied","Data":"76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8"}
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.658802 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6zgv" event={"ID":"0e6fb47e-c4c0-4933-97c3-adaa5b694dae","Type":"ContainerDied","Data":"cac5b8b83be3b28d99de2640461cdcb272756ce59d8faf38ba42ecf616dec683"}
Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.658815 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6zgv"
Need to start a new one" pod="openshift-marketplace/community-operators-j6zgv" Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.658847 4846 scope.go:117] "RemoveContainer" containerID="76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8" Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.713337 4846 scope.go:117] "RemoveContainer" containerID="e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a" Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.713475 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6zgv"] Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.741941 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j6zgv"] Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.756704 4846 scope.go:117] "RemoveContainer" containerID="9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03" Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.793747 4846 scope.go:117] "RemoveContainer" containerID="76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8" Nov 22 10:06:31 crc kubenswrapper[4846]: E1122 10:06:31.794395 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8\": container with ID starting with 76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8 not found: ID does not exist" containerID="76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8" Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.794432 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8"} err="failed to get container status \"76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8\": rpc error: code = NotFound desc = could not find container \"76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8\": container with ID starting with 76e743185931111b28d8e52dbc531e5b543d011bd90d3b69bc02db952cc386e8 not found: ID does not exist" Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.794462 4846 scope.go:117] "RemoveContainer" containerID="e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a" Nov 22 10:06:31 crc kubenswrapper[4846]: E1122 10:06:31.796416 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a\": container with ID starting with e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a not found: ID does not exist" containerID="e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a" Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.796458 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a"} err="failed to get container status \"e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a\": rpc error: code = NotFound desc = could not find container \"e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a\": container with ID starting with e4ebc96cc0527c961a5abfd1c8f068b0c8407006a1ce81236e475d6081e29d0a not found: ID does not exist" Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.796486 4846 scope.go:117] "RemoveContainer" 
containerID="9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03" Nov 22 10:06:31 crc kubenswrapper[4846]: E1122 10:06:31.798018 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03\": container with ID starting with 9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03 not found: ID does not exist" containerID="9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03" Nov 22 10:06:31 crc kubenswrapper[4846]: I1122 10:06:31.798064 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03"} err="failed to get container status \"9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03\": rpc error: code = NotFound desc = could not find container \"9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03\": container with ID starting with 9f72db53a4636d456809a17631d0a7a8046c3d9fc1459936aba4f7a4c0004f03 not found: ID does not exist" Nov 22 10:06:32 crc kubenswrapper[4846]: I1122 10:06:32.036229 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:06:32 crc kubenswrapper[4846]: E1122 10:06:32.036950 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:06:32 crc kubenswrapper[4846]: I1122 10:06:32.049676 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" path="/var/lib/kubelet/pods/0e6fb47e-c4c0-4933-97c3-adaa5b694dae/volumes" Nov 22 10:06:45 crc kubenswrapper[4846]: I1122 10:06:45.036335 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:06:45 crc kubenswrapper[4846]: E1122 10:06:45.037280 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:06:56 crc kubenswrapper[4846]: I1122 10:06:56.052410 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:06:56 crc kubenswrapper[4846]: E1122 10:06:56.054828 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:07:10 crc kubenswrapper[4846]: I1122 10:07:10.035264 4846 scope.go:117] "RemoveContainer" 
containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:07:10 crc kubenswrapper[4846]: E1122 10:07:10.036189 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:07:23 crc kubenswrapper[4846]: I1122 10:07:23.035550 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:07:23 crc kubenswrapper[4846]: E1122 10:07:23.036338 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:07:38 crc kubenswrapper[4846]: I1122 10:07:38.036225 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:07:38 crc kubenswrapper[4846]: E1122 10:07:38.037588 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:07:50 crc kubenswrapper[4846]: I1122 10:07:50.035025 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:07:50 crc kubenswrapper[4846]: E1122 10:07:50.035876 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:08:01 crc kubenswrapper[4846]: I1122 10:08:01.035227 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:08:01 crc kubenswrapper[4846]: E1122 10:08:01.036161 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:08:16 crc kubenswrapper[4846]: I1122 10:08:16.054867 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:08:16 crc kubenswrapper[4846]: E1122 10:08:16.059170 4846 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.000124 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4cx8"] Nov 22 10:08:22 crc kubenswrapper[4846]: E1122 10:08:22.001324 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerName="extract-utilities" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.001347 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerName="extract-utilities" Nov 22 10:08:22 crc kubenswrapper[4846]: E1122 10:08:22.001401 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerName="registry-server" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.001413 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerName="registry-server" Nov 22 10:08:22 crc kubenswrapper[4846]: E1122 10:08:22.001447 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerName="extract-content" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.001458 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerName="extract-content" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.001800 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6fb47e-c4c0-4933-97c3-adaa5b694dae" containerName="registry-server" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.004010 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.026715 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4cx8"] Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.099400 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbbvk\" (UniqueName: \"kubernetes.io/projected/36aa0016-1f8e-4537-93bd-dd8bdec50954-kube-api-access-pbbvk\") pod \"redhat-operators-n4cx8\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.099549 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-catalog-content\") pod \"redhat-operators-n4cx8\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.099776 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-utilities\") pod \"redhat-operators-n4cx8\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.201372 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-catalog-content\") pod \"redhat-operators-n4cx8\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.201573 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-utilities\") pod \"redhat-operators-n4cx8\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.201683 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbbvk\" (UniqueName: \"kubernetes.io/projected/36aa0016-1f8e-4537-93bd-dd8bdec50954-kube-api-access-pbbvk\") pod \"redhat-operators-n4cx8\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.202319 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-catalog-content\") pod \"redhat-operators-n4cx8\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.202573 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-utilities\") pod \"redhat-operators-n4cx8\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.241094 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pbbvk\" (UniqueName: \"kubernetes.io/projected/36aa0016-1f8e-4537-93bd-dd8bdec50954-kube-api-access-pbbvk\") pod \"redhat-operators-n4cx8\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.348199 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:22 crc kubenswrapper[4846]: I1122 10:08:22.806442 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4cx8"] Nov 22 10:08:22 crc kubenswrapper[4846]: W1122 10:08:22.812335 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36aa0016_1f8e_4537_93bd_dd8bdec50954.slice/crio-eed05fe2b32fbbe911439d49298bb435711f9c5e0a947a98c0e7252c8b47a59c WatchSource:0}: Error finding container eed05fe2b32fbbe911439d49298bb435711f9c5e0a947a98c0e7252c8b47a59c: Status 404 returned error can't find the container with id eed05fe2b32fbbe911439d49298bb435711f9c5e0a947a98c0e7252c8b47a59c Nov 22 10:08:23 crc kubenswrapper[4846]: I1122 10:08:23.008387 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4cx8" event={"ID":"36aa0016-1f8e-4537-93bd-dd8bdec50954","Type":"ContainerStarted","Data":"eed05fe2b32fbbe911439d49298bb435711f9c5e0a947a98c0e7252c8b47a59c"} Nov 22 10:08:24 crc kubenswrapper[4846]: I1122 10:08:24.026004 4846 generic.go:334] "Generic (PLEG): container finished" podID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerID="7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2" exitCode=0 Nov 22 10:08:24 crc kubenswrapper[4846]: I1122 10:08:24.026107 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4cx8" event={"ID":"36aa0016-1f8e-4537-93bd-dd8bdec50954","Type":"ContainerDied","Data":"7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2"} Nov 22 10:08:25 crc kubenswrapper[4846]: I1122 10:08:25.042187 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4cx8" event={"ID":"36aa0016-1f8e-4537-93bd-dd8bdec50954","Type":"ContainerStarted","Data":"2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8"} Nov 22 10:08:26 crc kubenswrapper[4846]: I1122 10:08:26.075997 4846 generic.go:334] "Generic (PLEG): container finished" podID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerID="2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8" exitCode=0 Nov 22 10:08:26 crc kubenswrapper[4846]: I1122 10:08:26.076138 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4cx8" event={"ID":"36aa0016-1f8e-4537-93bd-dd8bdec50954","Type":"ContainerDied","Data":"2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8"} Nov 22 10:08:27 crc kubenswrapper[4846]: I1122 10:08:27.090491 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4cx8" event={"ID":"36aa0016-1f8e-4537-93bd-dd8bdec50954","Type":"ContainerStarted","Data":"1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0"} Nov 22 10:08:27 crc kubenswrapper[4846]: I1122 10:08:27.114366 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4cx8" podStartSLOduration=3.681947285 podStartE2EDuration="6.114348386s" 
podCreationTimestamp="2025-11-22 10:08:21 +0000 UTC" firstStartedPulling="2025-11-22 10:08:24.028495547 +0000 UTC m=+3278.964185206" lastFinishedPulling="2025-11-22 10:08:26.460896648 +0000 UTC m=+3281.396586307" observedRunningTime="2025-11-22 10:08:27.111210016 +0000 UTC m=+3282.046899655" watchObservedRunningTime="2025-11-22 10:08:27.114348386 +0000 UTC m=+3282.050038035" Nov 22 10:08:28 crc kubenswrapper[4846]: I1122 10:08:28.036199 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:08:28 crc kubenswrapper[4846]: E1122 10:08:28.036555 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:08:32 crc kubenswrapper[4846]: I1122 10:08:32.348912 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:32 crc kubenswrapper[4846]: I1122 10:08:32.349483 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:33 crc kubenswrapper[4846]: I1122 10:08:33.421363 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n4cx8" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerName="registry-server" probeResult="failure" output=< Nov 22 10:08:33 crc kubenswrapper[4846]: timeout: failed to connect service ":50051" within 1s Nov 22 10:08:33 crc kubenswrapper[4846]: > Nov 22 10:08:40 crc kubenswrapper[4846]: I1122 10:08:40.035930 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:08:41 crc kubenswrapper[4846]: I1122 10:08:41.255416 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"e34fc36b8e8ae993fb618a9e6c830552331b6d7c36106ffc28661b8ddbdc215f"} Nov 22 10:08:42 crc kubenswrapper[4846]: I1122 10:08:42.416526 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:42 crc kubenswrapper[4846]: I1122 10:08:42.489610 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:42 crc kubenswrapper[4846]: I1122 10:08:42.914365 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4cx8"] Nov 22 10:08:44 crc kubenswrapper[4846]: I1122 10:08:44.289511 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4cx8" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerName="registry-server" containerID="cri-o://1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0" gracePeriod=2 Nov 22 10:08:44 crc kubenswrapper[4846]: I1122 10:08:44.872606 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:44 crc kubenswrapper[4846]: I1122 10:08:44.986025 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-utilities\") pod \"36aa0016-1f8e-4537-93bd-dd8bdec50954\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " Nov 22 10:08:44 crc kubenswrapper[4846]: I1122 10:08:44.986269 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbbvk\" (UniqueName: \"kubernetes.io/projected/36aa0016-1f8e-4537-93bd-dd8bdec50954-kube-api-access-pbbvk\") pod \"36aa0016-1f8e-4537-93bd-dd8bdec50954\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " Nov 22 10:08:44 crc kubenswrapper[4846]: I1122 10:08:44.986393 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-catalog-content\") pod \"36aa0016-1f8e-4537-93bd-dd8bdec50954\" (UID: \"36aa0016-1f8e-4537-93bd-dd8bdec50954\") " Nov 22 10:08:44 crc kubenswrapper[4846]: I1122 10:08:44.987279 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-utilities" (OuterVolumeSpecName: "utilities") pod "36aa0016-1f8e-4537-93bd-dd8bdec50954" (UID: "36aa0016-1f8e-4537-93bd-dd8bdec50954"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:08:44 crc kubenswrapper[4846]: I1122 10:08:44.994445 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36aa0016-1f8e-4537-93bd-dd8bdec50954-kube-api-access-pbbvk" (OuterVolumeSpecName: "kube-api-access-pbbvk") pod "36aa0016-1f8e-4537-93bd-dd8bdec50954" (UID: "36aa0016-1f8e-4537-93bd-dd8bdec50954"). InnerVolumeSpecName "kube-api-access-pbbvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.084616 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36aa0016-1f8e-4537-93bd-dd8bdec50954" (UID: "36aa0016-1f8e-4537-93bd-dd8bdec50954"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.088554 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.088589 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36aa0016-1f8e-4537-93bd-dd8bdec50954-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.088601 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbbvk\" (UniqueName: \"kubernetes.io/projected/36aa0016-1f8e-4537-93bd-dd8bdec50954-kube-api-access-pbbvk\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.304168 4846 generic.go:334] "Generic (PLEG): container finished" podID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerID="1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0" exitCode=0 Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.304226 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4cx8" event={"ID":"36aa0016-1f8e-4537-93bd-dd8bdec50954","Type":"ContainerDied","Data":"1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0"} Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.304306 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4cx8" event={"ID":"36aa0016-1f8e-4537-93bd-dd8bdec50954","Type":"ContainerDied","Data":"eed05fe2b32fbbe911439d49298bb435711f9c5e0a947a98c0e7252c8b47a59c"} Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.304245 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4cx8" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.304343 4846 scope.go:117] "RemoveContainer" containerID="1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.349092 4846 scope.go:117] "RemoveContainer" containerID="2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.351229 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4cx8"] Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.370448 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4cx8"] Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.387536 4846 scope.go:117] "RemoveContainer" containerID="7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.436838 4846 scope.go:117] "RemoveContainer" containerID="1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0" Nov 22 10:08:45 crc kubenswrapper[4846]: E1122 10:08:45.437655 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0\": container with ID starting with 1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0 not found: ID does not exist" containerID="1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.437697 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0"} err="failed to get container status \"1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0\": rpc error: code = NotFound desc = could not find container \"1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0\": container with ID starting with 1993e287230662e8da89f835ec2a46847eef03f36c573de460d11b559b04bbc0 not found: ID does not exist" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.437728 4846 scope.go:117] "RemoveContainer" containerID="2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8" Nov 22 10:08:45 crc kubenswrapper[4846]: E1122 10:08:45.438223 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8\": container with ID starting with 2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8 not found: ID does not exist" containerID="2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.438320 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8"} err="failed to get container status \"2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8\": rpc error: code = NotFound desc = could not find container \"2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8\": container with ID starting with 2173d51c2954e1cfc65674614787daaedce47522770d2120abe3c5361f0365e8 not found: ID does not exist" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.438389 4846 scope.go:117] "RemoveContainer" 
containerID="7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2" Nov 22 10:08:45 crc kubenswrapper[4846]: E1122 10:08:45.439696 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2\": container with ID starting with 7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2 not found: ID does not exist" containerID="7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2" Nov 22 10:08:45 crc kubenswrapper[4846]: I1122 10:08:45.439754 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2"} err="failed to get container status \"7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2\": rpc error: code = NotFound desc = could not find container \"7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2\": container with ID starting with 7a58e33f34b6543316cf1cf2feef566d6d212e44b8e62fc4a5fbd84878e79ba2 not found: ID does not exist" Nov 22 10:08:46 crc kubenswrapper[4846]: I1122 10:08:46.066234 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" path="/var/lib/kubelet/pods/36aa0016-1f8e-4537-93bd-dd8bdec50954/volumes" Nov 22 10:10:58 crc kubenswrapper[4846]: I1122 10:10:58.625986 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:10:58 crc kubenswrapper[4846]: I1122 10:10:58.626656 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.231197 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n69vg"] Nov 22 10:11:13 crc kubenswrapper[4846]: E1122 10:11:13.232403 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerName="extract-content" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.232427 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerName="extract-content" Nov 22 10:11:13 crc kubenswrapper[4846]: E1122 10:11:13.232463 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerName="extract-utilities" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.232476 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerName="extract-utilities" Nov 22 10:11:13 crc kubenswrapper[4846]: E1122 10:11:13.232503 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerName="registry-server" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.232516 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerName="registry-server" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 
10:11:13.232876 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="36aa0016-1f8e-4537-93bd-dd8bdec50954" containerName="registry-server" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.235437 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.249081 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69vg"] Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.299651 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-catalog-content\") pod \"redhat-marketplace-n69vg\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.299739 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-utilities\") pod \"redhat-marketplace-n69vg\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.299799 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgh47\" (UniqueName: \"kubernetes.io/projected/d2d14ccf-7625-42a3-9f06-6706bef261aa-kube-api-access-rgh47\") pod \"redhat-marketplace-n69vg\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.401870 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-catalog-content\") pod \"redhat-marketplace-n69vg\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.401924 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-utilities\") pod \"redhat-marketplace-n69vg\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.401951 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgh47\" (UniqueName: \"kubernetes.io/projected/d2d14ccf-7625-42a3-9f06-6706bef261aa-kube-api-access-rgh47\") pod \"redhat-marketplace-n69vg\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.402654 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-utilities\") pod \"redhat-marketplace-n69vg\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.402698 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-catalog-content\") pod \"redhat-marketplace-n69vg\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.425018 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgh47\" (UniqueName: \"kubernetes.io/projected/d2d14ccf-7625-42a3-9f06-6706bef261aa-kube-api-access-rgh47\") pod \"redhat-marketplace-n69vg\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:13 crc kubenswrapper[4846]: I1122 10:11:13.583447 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:14 crc kubenswrapper[4846]: I1122 10:11:14.073503 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69vg"] Nov 22 10:11:14 crc kubenswrapper[4846]: W1122 10:11:14.076720 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2d14ccf_7625_42a3_9f06_6706bef261aa.slice/crio-9bedf435e9897a5236e9be82681c2f7d089947bc01d14a37f343859fd9f8a868 WatchSource:0}: Error finding container 9bedf435e9897a5236e9be82681c2f7d089947bc01d14a37f343859fd9f8a868: Status 404 returned error can't find the container with id 9bedf435e9897a5236e9be82681c2f7d089947bc01d14a37f343859fd9f8a868 Nov 22 10:11:15 crc kubenswrapper[4846]: I1122 10:11:15.058157 4846 generic.go:334] "Generic (PLEG): container finished" podID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerID="a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691" exitCode=0 Nov 22 10:11:15 crc kubenswrapper[4846]: I1122 10:11:15.058228 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69vg" event={"ID":"d2d14ccf-7625-42a3-9f06-6706bef261aa","Type":"ContainerDied","Data":"a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691"} Nov 22 10:11:15 crc kubenswrapper[4846]: I1122 10:11:15.058507 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69vg" event={"ID":"d2d14ccf-7625-42a3-9f06-6706bef261aa","Type":"ContainerStarted","Data":"9bedf435e9897a5236e9be82681c2f7d089947bc01d14a37f343859fd9f8a868"} Nov 22 10:11:15 crc kubenswrapper[4846]: I1122 10:11:15.061089 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:11:16 crc kubenswrapper[4846]: I1122 10:11:16.069021 4846 generic.go:334] "Generic (PLEG): container finished" podID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerID="764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522" exitCode=0 Nov 22 10:11:16 crc kubenswrapper[4846]: I1122 10:11:16.069163 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69vg" event={"ID":"d2d14ccf-7625-42a3-9f06-6706bef261aa","Type":"ContainerDied","Data":"764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522"} Nov 22 10:11:16 crc kubenswrapper[4846]: E1122 10:11:16.164501 4846 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2d14ccf_7625_42a3_9f06_6706bef261aa.slice/crio-conmon-764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522.scope\": RecentStats: unable to 
find data in memory cache]" Nov 22 10:11:17 crc kubenswrapper[4846]: I1122 10:11:17.091016 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69vg" event={"ID":"d2d14ccf-7625-42a3-9f06-6706bef261aa","Type":"ContainerStarted","Data":"4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f"} Nov 22 10:11:17 crc kubenswrapper[4846]: I1122 10:11:17.140377 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n69vg" podStartSLOduration=2.725359663 podStartE2EDuration="4.140337625s" podCreationTimestamp="2025-11-22 10:11:13 +0000 UTC" firstStartedPulling="2025-11-22 10:11:15.060788044 +0000 UTC m=+3449.996477693" lastFinishedPulling="2025-11-22 10:11:16.475765996 +0000 UTC m=+3451.411455655" observedRunningTime="2025-11-22 10:11:17.108264304 +0000 UTC m=+3452.043953953" watchObservedRunningTime="2025-11-22 10:11:17.140337625 +0000 UTC m=+3452.076027294" Nov 22 10:11:23 crc kubenswrapper[4846]: I1122 10:11:23.584124 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:23 crc kubenswrapper[4846]: I1122 10:11:23.584937 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:23 crc kubenswrapper[4846]: I1122 10:11:23.688458 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:24 crc kubenswrapper[4846]: I1122 10:11:24.241855 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:24 crc kubenswrapper[4846]: I1122 10:11:24.294126 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69vg"] Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.188586 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n69vg" podUID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerName="registry-server" containerID="cri-o://4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f" gracePeriod=2 Nov 22 10:11:26 crc kubenswrapper[4846]: E1122 10:11:26.451454 4846 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2d14ccf_7625_42a3_9f06_6706bef261aa.slice/crio-4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2d14ccf_7625_42a3_9f06_6706bef261aa.slice/crio-conmon-4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f.scope\": RecentStats: unable to find data in memory cache]" Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.717193 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.867009 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-catalog-content\") pod \"d2d14ccf-7625-42a3-9f06-6706bef261aa\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.867129 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-utilities\") pod \"d2d14ccf-7625-42a3-9f06-6706bef261aa\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.867213 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgh47\" (UniqueName: \"kubernetes.io/projected/d2d14ccf-7625-42a3-9f06-6706bef261aa-kube-api-access-rgh47\") pod \"d2d14ccf-7625-42a3-9f06-6706bef261aa\" (UID: \"d2d14ccf-7625-42a3-9f06-6706bef261aa\") " Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.868173 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-utilities" (OuterVolumeSpecName: "utilities") pod "d2d14ccf-7625-42a3-9f06-6706bef261aa" (UID: "d2d14ccf-7625-42a3-9f06-6706bef261aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.873309 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d14ccf-7625-42a3-9f06-6706bef261aa-kube-api-access-rgh47" (OuterVolumeSpecName: "kube-api-access-rgh47") pod "d2d14ccf-7625-42a3-9f06-6706bef261aa" (UID: "d2d14ccf-7625-42a3-9f06-6706bef261aa"). InnerVolumeSpecName "kube-api-access-rgh47". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.900693 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2d14ccf-7625-42a3-9f06-6706bef261aa" (UID: "d2d14ccf-7625-42a3-9f06-6706bef261aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.969588 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgh47\" (UniqueName: \"kubernetes.io/projected/d2d14ccf-7625-42a3-9f06-6706bef261aa-kube-api-access-rgh47\") on node \"crc\" DevicePath \"\"" Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.969637 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:11:26 crc kubenswrapper[4846]: I1122 10:11:26.969656 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2d14ccf-7625-42a3-9f06-6706bef261aa-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.199384 4846 generic.go:334] "Generic (PLEG): container finished" podID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerID="4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f" exitCode=0 Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.199428 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69vg" event={"ID":"d2d14ccf-7625-42a3-9f06-6706bef261aa","Type":"ContainerDied","Data":"4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f"} Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.200356 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n69vg" event={"ID":"d2d14ccf-7625-42a3-9f06-6706bef261aa","Type":"ContainerDied","Data":"9bedf435e9897a5236e9be82681c2f7d089947bc01d14a37f343859fd9f8a868"} Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.199524 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n69vg" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.200437 4846 scope.go:117] "RemoveContainer" containerID="4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.232825 4846 scope.go:117] "RemoveContainer" containerID="764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.242296 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69vg"] Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.252911 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n69vg"] Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.265435 4846 scope.go:117] "RemoveContainer" containerID="a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.300516 4846 scope.go:117] "RemoveContainer" containerID="4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f" Nov 22 10:11:27 crc kubenswrapper[4846]: E1122 10:11:27.300999 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f\": container with ID starting with 4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f not found: ID does not exist" containerID="4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.301070 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f"} err="failed to get container status \"4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f\": rpc error: code = NotFound desc = could not find container \"4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f\": container with ID starting with 4f6e4f579dbf6ddee0df2011373190c4b14a205ff0d862014f94fe04cd16634f not found: ID does not exist" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.301108 4846 scope.go:117] "RemoveContainer" containerID="764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522" Nov 22 10:11:27 crc kubenswrapper[4846]: E1122 10:11:27.301717 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522\": container with ID starting with 764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522 not found: ID does not exist" containerID="764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.301773 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522"} err="failed to get container status \"764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522\": rpc error: code = NotFound desc = could not find container \"764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522\": container with ID starting with 764b278dae965189a02d77e9080ca80de2fc30e943f03d0041635bdc07fce522 not found: ID does not exist" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.301811 4846 scope.go:117] "RemoveContainer" 
containerID="a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691" Nov 22 10:11:27 crc kubenswrapper[4846]: E1122 10:11:27.302478 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691\": container with ID starting with a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691 not found: ID does not exist" containerID="a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691" Nov 22 10:11:27 crc kubenswrapper[4846]: I1122 10:11:27.302509 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691"} err="failed to get container status \"a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691\": rpc error: code = NotFound desc = could not find container \"a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691\": container with ID starting with a7411ff00af3873a139822f08b03e2cafc453ccce7395af9dac8d0db4ec58691 not found: ID does not exist" Nov 22 10:11:28 crc kubenswrapper[4846]: I1122 10:11:28.046442 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d14ccf-7625-42a3-9f06-6706bef261aa" path="/var/lib/kubelet/pods/d2d14ccf-7625-42a3-9f06-6706bef261aa/volumes" Nov 22 10:11:28 crc kubenswrapper[4846]: I1122 10:11:28.626074 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:11:28 crc kubenswrapper[4846]: I1122 10:11:28.626155 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:11:58 crc kubenswrapper[4846]: I1122 10:11:58.625734 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:11:58 crc kubenswrapper[4846]: I1122 10:11:58.626281 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:11:58 crc kubenswrapper[4846]: I1122 10:11:58.626332 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 10:11:58 crc kubenswrapper[4846]: I1122 10:11:58.627238 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e34fc36b8e8ae993fb618a9e6c830552331b6d7c36106ffc28661b8ddbdc215f"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:11:58 crc 
kubenswrapper[4846]: I1122 10:11:58.627310 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://e34fc36b8e8ae993fb618a9e6c830552331b6d7c36106ffc28661b8ddbdc215f" gracePeriod=600 Nov 22 10:11:59 crc kubenswrapper[4846]: I1122 10:11:59.528927 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="e34fc36b8e8ae993fb618a9e6c830552331b6d7c36106ffc28661b8ddbdc215f" exitCode=0 Nov 22 10:11:59 crc kubenswrapper[4846]: I1122 10:11:59.529017 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"e34fc36b8e8ae993fb618a9e6c830552331b6d7c36106ffc28661b8ddbdc215f"} Nov 22 10:11:59 crc kubenswrapper[4846]: I1122 10:11:59.529635 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88"} Nov 22 10:11:59 crc kubenswrapper[4846]: I1122 10:11:59.529665 4846 scope.go:117] "RemoveContainer" containerID="7a4d6217195c8f38b91be467594f896a29fa10f4bf161a19c5071fcabef5a70e" Nov 22 10:13:44 crc kubenswrapper[4846]: I1122 10:13:44.522548 4846 generic.go:334] "Generic (PLEG): container finished" podID="0746377b-0ff5-4289-b4b6-1e9c3a166533" containerID="1b00de3d4c7bcaba4ef69cb16114114bd776cba7e9962ad5d60019e66ca5eb52" exitCode=0 Nov 22 10:13:44 crc kubenswrapper[4846]: I1122 10:13:44.522599 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0746377b-0ff5-4289-b4b6-1e9c3a166533","Type":"ContainerDied","Data":"1b00de3d4c7bcaba4ef69cb16114114bd776cba7e9962ad5d60019e66ca5eb52"} Nov 22 10:13:45 crc kubenswrapper[4846]: I1122 10:13:45.921430 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.089584 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-workdir\") pod \"0746377b-0ff5-4289-b4b6-1e9c3a166533\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.090422 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-config-data\") pod \"0746377b-0ff5-4289-b4b6-1e9c3a166533\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.090511 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29xdz\" (UniqueName: \"kubernetes.io/projected/0746377b-0ff5-4289-b4b6-1e9c3a166533-kube-api-access-29xdz\") pod \"0746377b-0ff5-4289-b4b6-1e9c3a166533\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.090567 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ssh-key\") pod \"0746377b-0ff5-4289-b4b6-1e9c3a166533\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.090663 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config-secret\") pod \"0746377b-0ff5-4289-b4b6-1e9c3a166533\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.090708 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0746377b-0ff5-4289-b4b6-1e9c3a166533\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.090806 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config\") pod \"0746377b-0ff5-4289-b4b6-1e9c3a166533\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.090924 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ca-certs\") pod \"0746377b-0ff5-4289-b4b6-1e9c3a166533\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.091003 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-temporary\") pod \"0746377b-0ff5-4289-b4b6-1e9c3a166533\" (UID: \"0746377b-0ff5-4289-b4b6-1e9c3a166533\") " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.091348 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-config-data" (OuterVolumeSpecName: "config-data") pod 
"0746377b-0ff5-4289-b4b6-1e9c3a166533" (UID: "0746377b-0ff5-4289-b4b6-1e9c3a166533"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.092656 4846 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.093286 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "0746377b-0ff5-4289-b4b6-1e9c3a166533" (UID: "0746377b-0ff5-4289-b4b6-1e9c3a166533"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.099711 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0746377b-0ff5-4289-b4b6-1e9c3a166533-kube-api-access-29xdz" (OuterVolumeSpecName: "kube-api-access-29xdz") pod "0746377b-0ff5-4289-b4b6-1e9c3a166533" (UID: "0746377b-0ff5-4289-b4b6-1e9c3a166533"). InnerVolumeSpecName "kube-api-access-29xdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.102780 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "0746377b-0ff5-4289-b4b6-1e9c3a166533" (UID: "0746377b-0ff5-4289-b4b6-1e9c3a166533"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.103304 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "0746377b-0ff5-4289-b4b6-1e9c3a166533" (UID: "0746377b-0ff5-4289-b4b6-1e9c3a166533"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.123960 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0746377b-0ff5-4289-b4b6-1e9c3a166533" (UID: "0746377b-0ff5-4289-b4b6-1e9c3a166533"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.138149 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "0746377b-0ff5-4289-b4b6-1e9c3a166533" (UID: "0746377b-0ff5-4289-b4b6-1e9c3a166533"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.143071 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0746377b-0ff5-4289-b4b6-1e9c3a166533" (UID: "0746377b-0ff5-4289-b4b6-1e9c3a166533"). 
InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.161416 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0746377b-0ff5-4289-b4b6-1e9c3a166533" (UID: "0746377b-0ff5-4289-b4b6-1e9c3a166533"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.195274 4846 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.196210 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29xdz\" (UniqueName: \"kubernetes.io/projected/0746377b-0ff5-4289-b4b6-1e9c3a166533-kube-api-access-29xdz\") on node \"crc\" DevicePath \"\"" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.196437 4846 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.196519 4846 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.196605 4846 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.196669 4846 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0746377b-0ff5-4289-b4b6-1e9c3a166533-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.196775 4846 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0746377b-0ff5-4289-b4b6-1e9c3a166533-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.196836 4846 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0746377b-0ff5-4289-b4b6-1e9c3a166533-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.230002 4846 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.299550 4846 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.548169 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0746377b-0ff5-4289-b4b6-1e9c3a166533","Type":"ContainerDied","Data":"38c1e84ddbd03357e2f916e54cd964bd4f0ec7f2813509552f29cfc461931e1e"} Nov 22 10:13:46 crc 
kubenswrapper[4846]: I1122 10:13:46.548275 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38c1e84ddbd03357e2f916e54cd964bd4f0ec7f2813509552f29cfc461931e1e" Nov 22 10:13:46 crc kubenswrapper[4846]: I1122 10:13:46.548282 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.080358 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 22 10:13:49 crc kubenswrapper[4846]: E1122 10:13:49.081190 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerName="extract-content" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.081205 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerName="extract-content" Nov 22 10:13:49 crc kubenswrapper[4846]: E1122 10:13:49.081230 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerName="registry-server" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.081236 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerName="registry-server" Nov 22 10:13:49 crc kubenswrapper[4846]: E1122 10:13:49.081251 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerName="extract-utilities" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.081257 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerName="extract-utilities" Nov 22 10:13:49 crc kubenswrapper[4846]: E1122 10:13:49.081270 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0746377b-0ff5-4289-b4b6-1e9c3a166533" containerName="tempest-tests-tempest-tests-runner" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.081278 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0746377b-0ff5-4289-b4b6-1e9c3a166533" containerName="tempest-tests-tempest-tests-runner" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.081484 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d14ccf-7625-42a3-9f06-6706bef261aa" containerName="registry-server" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.081498 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0746377b-0ff5-4289-b4b6-1e9c3a166533" containerName="tempest-tests-tempest-tests-runner" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.082181 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.084950 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ttc5q" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.089366 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.254923 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"423242c9-5d5f-4a1d-83db-13989d8d78b1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.255114 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqm7\" (UniqueName: \"kubernetes.io/projected/423242c9-5d5f-4a1d-83db-13989d8d78b1-kube-api-access-rgqm7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"423242c9-5d5f-4a1d-83db-13989d8d78b1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.356491 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqm7\" (UniqueName: \"kubernetes.io/projected/423242c9-5d5f-4a1d-83db-13989d8d78b1-kube-api-access-rgqm7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"423242c9-5d5f-4a1d-83db-13989d8d78b1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.356874 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"423242c9-5d5f-4a1d-83db-13989d8d78b1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.357508 4846 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"423242c9-5d5f-4a1d-83db-13989d8d78b1\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.388382 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"423242c9-5d5f-4a1d-83db-13989d8d78b1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.389427 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqm7\" (UniqueName: \"kubernetes.io/projected/423242c9-5d5f-4a1d-83db-13989d8d78b1-kube-api-access-rgqm7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"423242c9-5d5f-4a1d-83db-13989d8d78b1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 10:13:49 crc 
kubenswrapper[4846]: I1122 10:13:49.410430 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 10:13:49 crc kubenswrapper[4846]: I1122 10:13:49.891484 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 22 10:13:49 crc kubenswrapper[4846]: W1122 10:13:49.902802 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod423242c9_5d5f_4a1d_83db_13989d8d78b1.slice/crio-4b8e2e82bd5dd008eaa3bb36a4ef845cfe186e22bd1ce606225f3ddc9a47958d WatchSource:0}: Error finding container 4b8e2e82bd5dd008eaa3bb36a4ef845cfe186e22bd1ce606225f3ddc9a47958d: Status 404 returned error can't find the container with id 4b8e2e82bd5dd008eaa3bb36a4ef845cfe186e22bd1ce606225f3ddc9a47958d Nov 22 10:13:50 crc kubenswrapper[4846]: I1122 10:13:50.591829 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"423242c9-5d5f-4a1d-83db-13989d8d78b1","Type":"ContainerStarted","Data":"4b8e2e82bd5dd008eaa3bb36a4ef845cfe186e22bd1ce606225f3ddc9a47958d"} Nov 22 10:13:51 crc kubenswrapper[4846]: I1122 10:13:51.604393 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"423242c9-5d5f-4a1d-83db-13989d8d78b1","Type":"ContainerStarted","Data":"309d43451c939cf5a440420d1e0cd6c6e700f00ec74625b73e8fc1f52b0828cd"} Nov 22 10:13:51 crc kubenswrapper[4846]: I1122 10:13:51.621718 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.794440792 podStartE2EDuration="2.621695779s" podCreationTimestamp="2025-11-22 10:13:49 +0000 UTC" firstStartedPulling="2025-11-22 10:13:49.906487964 +0000 UTC m=+3604.842177623" lastFinishedPulling="2025-11-22 10:13:50.733742961 +0000 UTC m=+3605.669432610" observedRunningTime="2025-11-22 10:13:51.620231128 +0000 UTC m=+3606.555920817" watchObservedRunningTime="2025-11-22 10:13:51.621695779 +0000 UTC m=+3606.557385418" Nov 22 10:13:58 crc kubenswrapper[4846]: I1122 10:13:58.625842 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:13:58 crc kubenswrapper[4846]: I1122 10:13:58.626480 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.641429 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c5szc/must-gather-d7tx8"] Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.643937 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.645958 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c5szc"/"kube-root-ca.crt" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.646121 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c5szc"/"openshift-service-ca.crt" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.646203 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-c5szc"/"default-dockercfg-lzmsf" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.680056 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c5szc/must-gather-d7tx8"] Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.774824 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e321743e-6fcf-4416-aadd-078013511625-must-gather-output\") pod \"must-gather-d7tx8\" (UID: \"e321743e-6fcf-4416-aadd-078013511625\") " pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.774925 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksxd4\" (UniqueName: \"kubernetes.io/projected/e321743e-6fcf-4416-aadd-078013511625-kube-api-access-ksxd4\") pod \"must-gather-d7tx8\" (UID: \"e321743e-6fcf-4416-aadd-078013511625\") " pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.876394 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e321743e-6fcf-4416-aadd-078013511625-must-gather-output\") pod \"must-gather-d7tx8\" (UID: \"e321743e-6fcf-4416-aadd-078013511625\") " pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.876486 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksxd4\" (UniqueName: \"kubernetes.io/projected/e321743e-6fcf-4416-aadd-078013511625-kube-api-access-ksxd4\") pod \"must-gather-d7tx8\" (UID: \"e321743e-6fcf-4416-aadd-078013511625\") " pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.876887 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e321743e-6fcf-4416-aadd-078013511625-must-gather-output\") pod \"must-gather-d7tx8\" (UID: \"e321743e-6fcf-4416-aadd-078013511625\") " pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.893369 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksxd4\" (UniqueName: \"kubernetes.io/projected/e321743e-6fcf-4416-aadd-078013511625-kube-api-access-ksxd4\") pod \"must-gather-d7tx8\" (UID: \"e321743e-6fcf-4416-aadd-078013511625\") " pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:14:13 crc kubenswrapper[4846]: I1122 10:14:13.972343 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:14:14 crc kubenswrapper[4846]: I1122 10:14:14.446034 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c5szc/must-gather-d7tx8"] Nov 22 10:14:14 crc kubenswrapper[4846]: I1122 10:14:14.846201 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/must-gather-d7tx8" event={"ID":"e321743e-6fcf-4416-aadd-078013511625","Type":"ContainerStarted","Data":"2bdc8f4d7a6dd50c6de9f0614f5a08fdaa0fe65031f96bf99060d63b2ec7d541"} Nov 22 10:14:21 crc kubenswrapper[4846]: I1122 10:14:21.916444 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/must-gather-d7tx8" event={"ID":"e321743e-6fcf-4416-aadd-078013511625","Type":"ContainerStarted","Data":"e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd"} Nov 22 10:14:21 crc kubenswrapper[4846]: I1122 10:14:21.916825 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/must-gather-d7tx8" event={"ID":"e321743e-6fcf-4416-aadd-078013511625","Type":"ContainerStarted","Data":"1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd"} Nov 22 10:14:21 crc kubenswrapper[4846]: I1122 10:14:21.937741 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c5szc/must-gather-d7tx8" podStartSLOduration=2.684033801 podStartE2EDuration="8.937717857s" podCreationTimestamp="2025-11-22 10:14:13 +0000 UTC" firstStartedPulling="2025-11-22 10:14:14.451303503 +0000 UTC m=+3629.386993162" lastFinishedPulling="2025-11-22 10:14:20.704987539 +0000 UTC m=+3635.640677218" observedRunningTime="2025-11-22 10:14:21.937651225 +0000 UTC m=+3636.873340914" watchObservedRunningTime="2025-11-22 10:14:21.937717857 +0000 UTC m=+3636.873407506" Nov 22 10:14:24 crc kubenswrapper[4846]: I1122 10:14:24.600554 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c5szc/crc-debug-dtdsx"] Nov 22 10:14:24 crc kubenswrapper[4846]: I1122 10:14:24.602545 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:14:24 crc kubenswrapper[4846]: I1122 10:14:24.698254 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-host\") pod \"crc-debug-dtdsx\" (UID: \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\") " pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:14:24 crc kubenswrapper[4846]: I1122 10:14:24.698570 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w4lv\" (UniqueName: \"kubernetes.io/projected/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-kube-api-access-8w4lv\") pod \"crc-debug-dtdsx\" (UID: \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\") " pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:14:24 crc kubenswrapper[4846]: I1122 10:14:24.800226 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w4lv\" (UniqueName: \"kubernetes.io/projected/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-kube-api-access-8w4lv\") pod \"crc-debug-dtdsx\" (UID: \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\") " pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:14:24 crc kubenswrapper[4846]: I1122 10:14:24.800578 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-host\") pod \"crc-debug-dtdsx\" (UID: \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\") " pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:14:24 crc kubenswrapper[4846]: I1122 10:14:24.800692 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-host\") pod \"crc-debug-dtdsx\" (UID: \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\") " pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:14:24 crc kubenswrapper[4846]: I1122 10:14:24.820304 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w4lv\" (UniqueName: \"kubernetes.io/projected/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-kube-api-access-8w4lv\") pod \"crc-debug-dtdsx\" (UID: \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\") " pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:14:24 crc kubenswrapper[4846]: I1122 10:14:24.926961 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:14:24 crc kubenswrapper[4846]: W1122 10:14:24.965566 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31fa85d3_b6c4_46c4_bfa7_af8ca2e9b425.slice/crio-a937e5f8a4764993b24aadbc77fad09cc69670f63b7ac5ddeb96b1d95f2cf0fa WatchSource:0}: Error finding container a937e5f8a4764993b24aadbc77fad09cc69670f63b7ac5ddeb96b1d95f2cf0fa: Status 404 returned error can't find the container with id a937e5f8a4764993b24aadbc77fad09cc69670f63b7ac5ddeb96b1d95f2cf0fa Nov 22 10:14:25 crc kubenswrapper[4846]: I1122 10:14:25.959752 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/crc-debug-dtdsx" event={"ID":"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425","Type":"ContainerStarted","Data":"a937e5f8a4764993b24aadbc77fad09cc69670f63b7ac5ddeb96b1d95f2cf0fa"} Nov 22 10:14:28 crc kubenswrapper[4846]: I1122 10:14:28.625136 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:14:28 crc kubenswrapper[4846]: I1122 10:14:28.625683 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:14:37 crc kubenswrapper[4846]: I1122 10:14:37.063104 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/crc-debug-dtdsx" event={"ID":"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425","Type":"ContainerStarted","Data":"0aa48728a1bc5805aa027f708a954a7c2f4c348c3eb5c0298946b6c915a2a302"} Nov 22 10:14:37 crc kubenswrapper[4846]: I1122 10:14:37.084123 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c5szc/crc-debug-dtdsx" podStartSLOduration=1.5210703570000002 podStartE2EDuration="13.084103454s" podCreationTimestamp="2025-11-22 10:14:24 +0000 UTC" firstStartedPulling="2025-11-22 10:14:24.967843233 +0000 UTC m=+3639.903532882" lastFinishedPulling="2025-11-22 10:14:36.53087633 +0000 UTC m=+3651.466565979" observedRunningTime="2025-11-22 10:14:37.077637739 +0000 UTC m=+3652.013327378" watchObservedRunningTime="2025-11-22 10:14:37.084103454 +0000 UTC m=+3652.019793103" Nov 22 10:14:58 crc kubenswrapper[4846]: I1122 10:14:58.625554 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:14:58 crc kubenswrapper[4846]: I1122 10:14:58.626240 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:14:58 crc kubenswrapper[4846]: I1122 10:14:58.626302 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 10:14:58 crc kubenswrapper[4846]: I1122 10:14:58.627147 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:14:58 crc kubenswrapper[4846]: I1122 10:14:58.627219 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" gracePeriod=600 Nov 22 10:14:58 crc kubenswrapper[4846]: E1122 10:14:58.772308 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:14:59 crc kubenswrapper[4846]: I1122 10:14:59.295085 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" exitCode=0 Nov 22 10:14:59 crc kubenswrapper[4846]: I1122 10:14:59.295141 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88"} Nov 22 10:14:59 crc kubenswrapper[4846]: I1122 10:14:59.295191 4846 scope.go:117] "RemoveContainer" containerID="e34fc36b8e8ae993fb618a9e6c830552331b6d7c36106ffc28661b8ddbdc215f" Nov 22 10:14:59 crc kubenswrapper[4846]: I1122 10:14:59.295966 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:14:59 crc kubenswrapper[4846]: E1122 10:14:59.296322 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.180457 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw"] Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.182119 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.185995 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.186392 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.192162 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw"] Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.336546 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cad888-0bd9-459d-a0fe-77386d6376a3-config-volume\") pod \"collect-profiles-29396775-c5ggw\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.336638 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4srbd\" (UniqueName: \"kubernetes.io/projected/27cad888-0bd9-459d-a0fe-77386d6376a3-kube-api-access-4srbd\") pod \"collect-profiles-29396775-c5ggw\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.336844 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cad888-0bd9-459d-a0fe-77386d6376a3-secret-volume\") pod \"collect-profiles-29396775-c5ggw\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.438738 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cad888-0bd9-459d-a0fe-77386d6376a3-config-volume\") pod \"collect-profiles-29396775-c5ggw\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.438868 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4srbd\" (UniqueName: \"kubernetes.io/projected/27cad888-0bd9-459d-a0fe-77386d6376a3-kube-api-access-4srbd\") pod \"collect-profiles-29396775-c5ggw\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.438917 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cad888-0bd9-459d-a0fe-77386d6376a3-secret-volume\") pod \"collect-profiles-29396775-c5ggw\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.439821 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cad888-0bd9-459d-a0fe-77386d6376a3-config-volume\") pod 
\"collect-profiles-29396775-c5ggw\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.451881 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cad888-0bd9-459d-a0fe-77386d6376a3-secret-volume\") pod \"collect-profiles-29396775-c5ggw\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.461315 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4srbd\" (UniqueName: \"kubernetes.io/projected/27cad888-0bd9-459d-a0fe-77386d6376a3-kube-api-access-4srbd\") pod \"collect-profiles-29396775-c5ggw\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.499667 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:00 crc kubenswrapper[4846]: I1122 10:15:00.949995 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw"] Nov 22 10:15:01 crc kubenswrapper[4846]: I1122 10:15:01.316433 4846 generic.go:334] "Generic (PLEG): container finished" podID="27cad888-0bd9-459d-a0fe-77386d6376a3" containerID="3f48e247bd00fcc75e127ea26d949838a86bfc4f626b50315c01a15d30f31904" exitCode=0 Nov 22 10:15:01 crc kubenswrapper[4846]: I1122 10:15:01.316534 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" event={"ID":"27cad888-0bd9-459d-a0fe-77386d6376a3","Type":"ContainerDied","Data":"3f48e247bd00fcc75e127ea26d949838a86bfc4f626b50315c01a15d30f31904"} Nov 22 10:15:01 crc kubenswrapper[4846]: I1122 10:15:01.316779 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" event={"ID":"27cad888-0bd9-459d-a0fe-77386d6376a3","Type":"ContainerStarted","Data":"856209374c56f4efa9dab3ad18a9fd3235673588e47bcbc72b69d8d7a7ccc905"} Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.702650 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.895596 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cad888-0bd9-459d-a0fe-77386d6376a3-secret-volume\") pod \"27cad888-0bd9-459d-a0fe-77386d6376a3\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.895849 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cad888-0bd9-459d-a0fe-77386d6376a3-config-volume\") pod \"27cad888-0bd9-459d-a0fe-77386d6376a3\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.895884 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4srbd\" (UniqueName: \"kubernetes.io/projected/27cad888-0bd9-459d-a0fe-77386d6376a3-kube-api-access-4srbd\") pod \"27cad888-0bd9-459d-a0fe-77386d6376a3\" (UID: \"27cad888-0bd9-459d-a0fe-77386d6376a3\") " Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.902532 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27cad888-0bd9-459d-a0fe-77386d6376a3-kube-api-access-4srbd" (OuterVolumeSpecName: "kube-api-access-4srbd") pod "27cad888-0bd9-459d-a0fe-77386d6376a3" (UID: "27cad888-0bd9-459d-a0fe-77386d6376a3"). InnerVolumeSpecName "kube-api-access-4srbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.905175 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cad888-0bd9-459d-a0fe-77386d6376a3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "27cad888-0bd9-459d-a0fe-77386d6376a3" (UID: "27cad888-0bd9-459d-a0fe-77386d6376a3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.905203 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27cad888-0bd9-459d-a0fe-77386d6376a3-config-volume" (OuterVolumeSpecName: "config-volume") pod "27cad888-0bd9-459d-a0fe-77386d6376a3" (UID: "27cad888-0bd9-459d-a0fe-77386d6376a3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.998783 4846 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27cad888-0bd9-459d-a0fe-77386d6376a3-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.998815 4846 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27cad888-0bd9-459d-a0fe-77386d6376a3-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:15:02 crc kubenswrapper[4846]: I1122 10:15:02.998826 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4srbd\" (UniqueName: \"kubernetes.io/projected/27cad888-0bd9-459d-a0fe-77386d6376a3-kube-api-access-4srbd\") on node \"crc\" DevicePath \"\"" Nov 22 10:15:03 crc kubenswrapper[4846]: I1122 10:15:03.342489 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" event={"ID":"27cad888-0bd9-459d-a0fe-77386d6376a3","Type":"ContainerDied","Data":"856209374c56f4efa9dab3ad18a9fd3235673588e47bcbc72b69d8d7a7ccc905"} Nov 22 10:15:03 crc kubenswrapper[4846]: I1122 10:15:03.342540 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856209374c56f4efa9dab3ad18a9fd3235673588e47bcbc72b69d8d7a7ccc905" Nov 22 10:15:03 crc kubenswrapper[4846]: I1122 10:15:03.342590 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-c5ggw" Nov 22 10:15:03 crc kubenswrapper[4846]: I1122 10:15:03.783032 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"] Nov 22 10:15:03 crc kubenswrapper[4846]: I1122 10:15:03.794113 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-sf48p"] Nov 22 10:15:04 crc kubenswrapper[4846]: I1122 10:15:04.073832 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea037c55-edbb-4c60-ab5e-7955eafa3139" path="/var/lib/kubelet/pods/ea037c55-edbb-4c60-ab5e-7955eafa3139/volumes" Nov 22 10:15:12 crc kubenswrapper[4846]: I1122 10:15:12.035166 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:15:12 crc kubenswrapper[4846]: E1122 10:15:12.036013 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:15:19 crc kubenswrapper[4846]: I1122 10:15:19.499753 4846 generic.go:334] "Generic (PLEG): container finished" podID="31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425" containerID="0aa48728a1bc5805aa027f708a954a7c2f4c348c3eb5c0298946b6c915a2a302" exitCode=0 Nov 22 10:15:19 crc kubenswrapper[4846]: I1122 10:15:19.499846 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/crc-debug-dtdsx" event={"ID":"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425","Type":"ContainerDied","Data":"0aa48728a1bc5805aa027f708a954a7c2f4c348c3eb5c0298946b6c915a2a302"} Nov 22 10:15:20 crc 
kubenswrapper[4846]: I1122 10:15:20.606724 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:15:20 crc kubenswrapper[4846]: I1122 10:15:20.634357 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w4lv\" (UniqueName: \"kubernetes.io/projected/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-kube-api-access-8w4lv\") pod \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\" (UID: \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\") " Nov 22 10:15:20 crc kubenswrapper[4846]: I1122 10:15:20.634561 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-host\") pod \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\" (UID: \"31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425\") " Nov 22 10:15:20 crc kubenswrapper[4846]: I1122 10:15:20.635092 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-host" (OuterVolumeSpecName: "host") pod "31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425" (UID: "31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 10:15:20 crc kubenswrapper[4846]: I1122 10:15:20.635700 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c5szc/crc-debug-dtdsx"] Nov 22 10:15:20 crc kubenswrapper[4846]: I1122 10:15:20.641028 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-kube-api-access-8w4lv" (OuterVolumeSpecName: "kube-api-access-8w4lv") pod "31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425" (UID: "31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425"). InnerVolumeSpecName "kube-api-access-8w4lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:15:20 crc kubenswrapper[4846]: I1122 10:15:20.648555 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c5szc/crc-debug-dtdsx"] Nov 22 10:15:20 crc kubenswrapper[4846]: I1122 10:15:20.735801 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w4lv\" (UniqueName: \"kubernetes.io/projected/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-kube-api-access-8w4lv\") on node \"crc\" DevicePath \"\"" Nov 22 10:15:20 crc kubenswrapper[4846]: I1122 10:15:20.735843 4846 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425-host\") on node \"crc\" DevicePath \"\"" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.519321 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a937e5f8a4764993b24aadbc77fad09cc69670f63b7ac5ddeb96b1d95f2cf0fa" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.519688 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-dtdsx" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.827475 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c5szc/crc-debug-c8ltt"] Nov 22 10:15:21 crc kubenswrapper[4846]: E1122 10:15:21.827950 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cad888-0bd9-459d-a0fe-77386d6376a3" containerName="collect-profiles" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.827966 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cad888-0bd9-459d-a0fe-77386d6376a3" containerName="collect-profiles" Nov 22 10:15:21 crc kubenswrapper[4846]: E1122 10:15:21.827983 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425" containerName="container-00" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.827992 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425" containerName="container-00" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.828238 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cad888-0bd9-459d-a0fe-77386d6376a3" containerName="collect-profiles" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.828280 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425" containerName="container-00" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.828998 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.957869 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtqvq\" (UniqueName: \"kubernetes.io/projected/30769d81-67b5-4bf2-bd86-3dba738f3789-kube-api-access-vtqvq\") pod \"crc-debug-c8ltt\" (UID: \"30769d81-67b5-4bf2-bd86-3dba738f3789\") " pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:21 crc kubenswrapper[4846]: I1122 10:15:21.957983 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30769d81-67b5-4bf2-bd86-3dba738f3789-host\") pod \"crc-debug-c8ltt\" (UID: \"30769d81-67b5-4bf2-bd86-3dba738f3789\") " pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:22 crc kubenswrapper[4846]: I1122 10:15:22.050415 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425" path="/var/lib/kubelet/pods/31fa85d3-b6c4-46c4-bfa7-af8ca2e9b425/volumes" Nov 22 10:15:22 crc kubenswrapper[4846]: I1122 10:15:22.077770 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtqvq\" (UniqueName: \"kubernetes.io/projected/30769d81-67b5-4bf2-bd86-3dba738f3789-kube-api-access-vtqvq\") pod \"crc-debug-c8ltt\" (UID: \"30769d81-67b5-4bf2-bd86-3dba738f3789\") " pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:22 crc kubenswrapper[4846]: I1122 10:15:22.078004 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30769d81-67b5-4bf2-bd86-3dba738f3789-host\") pod \"crc-debug-c8ltt\" (UID: \"30769d81-67b5-4bf2-bd86-3dba738f3789\") " pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:22 crc kubenswrapper[4846]: I1122 10:15:22.078455 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/30769d81-67b5-4bf2-bd86-3dba738f3789-host\") pod \"crc-debug-c8ltt\" (UID: \"30769d81-67b5-4bf2-bd86-3dba738f3789\") " pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:22 crc kubenswrapper[4846]: I1122 10:15:22.095373 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtqvq\" (UniqueName: \"kubernetes.io/projected/30769d81-67b5-4bf2-bd86-3dba738f3789-kube-api-access-vtqvq\") pod \"crc-debug-c8ltt\" (UID: \"30769d81-67b5-4bf2-bd86-3dba738f3789\") " pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:22 crc kubenswrapper[4846]: I1122 10:15:22.150557 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:22 crc kubenswrapper[4846]: W1122 10:15:22.183805 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30769d81_67b5_4bf2_bd86_3dba738f3789.slice/crio-b587b91811a44ce398a853817d944b1a9810d0fd01dfd3925aa07f335e51f286 WatchSource:0}: Error finding container b587b91811a44ce398a853817d944b1a9810d0fd01dfd3925aa07f335e51f286: Status 404 returned error can't find the container with id b587b91811a44ce398a853817d944b1a9810d0fd01dfd3925aa07f335e51f286 Nov 22 10:15:22 crc kubenswrapper[4846]: I1122 10:15:22.532159 4846 generic.go:334] "Generic (PLEG): container finished" podID="30769d81-67b5-4bf2-bd86-3dba738f3789" containerID="94bc43ce7ac6f9e79fbc149028547cb69dca11dfa3ec7c570dd4da4659759f36" exitCode=0 Nov 22 10:15:22 crc kubenswrapper[4846]: I1122 10:15:22.532295 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/crc-debug-c8ltt" event={"ID":"30769d81-67b5-4bf2-bd86-3dba738f3789","Type":"ContainerDied","Data":"94bc43ce7ac6f9e79fbc149028547cb69dca11dfa3ec7c570dd4da4659759f36"} Nov 22 10:15:22 crc kubenswrapper[4846]: I1122 10:15:22.532533 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/crc-debug-c8ltt" event={"ID":"30769d81-67b5-4bf2-bd86-3dba738f3789","Type":"ContainerStarted","Data":"b587b91811a44ce398a853817d944b1a9810d0fd01dfd3925aa07f335e51f286"} Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.037114 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:15:23 crc kubenswrapper[4846]: E1122 10:15:23.037510 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.042640 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c5szc/crc-debug-c8ltt"] Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.052870 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c5szc/crc-debug-c8ltt"] Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.649211 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.705917 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30769d81-67b5-4bf2-bd86-3dba738f3789-host\") pod \"30769d81-67b5-4bf2-bd86-3dba738f3789\" (UID: \"30769d81-67b5-4bf2-bd86-3dba738f3789\") " Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.706021 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtqvq\" (UniqueName: \"kubernetes.io/projected/30769d81-67b5-4bf2-bd86-3dba738f3789-kube-api-access-vtqvq\") pod \"30769d81-67b5-4bf2-bd86-3dba738f3789\" (UID: \"30769d81-67b5-4bf2-bd86-3dba738f3789\") " Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.706113 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30769d81-67b5-4bf2-bd86-3dba738f3789-host" (OuterVolumeSpecName: "host") pod "30769d81-67b5-4bf2-bd86-3dba738f3789" (UID: "30769d81-67b5-4bf2-bd86-3dba738f3789"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.706401 4846 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30769d81-67b5-4bf2-bd86-3dba738f3789-host\") on node \"crc\" DevicePath \"\"" Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.712382 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30769d81-67b5-4bf2-bd86-3dba738f3789-kube-api-access-vtqvq" (OuterVolumeSpecName: "kube-api-access-vtqvq") pod "30769d81-67b5-4bf2-bd86-3dba738f3789" (UID: "30769d81-67b5-4bf2-bd86-3dba738f3789"). InnerVolumeSpecName "kube-api-access-vtqvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:15:23 crc kubenswrapper[4846]: I1122 10:15:23.809123 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtqvq\" (UniqueName: \"kubernetes.io/projected/30769d81-67b5-4bf2-bd86-3dba738f3789-kube-api-access-vtqvq\") on node \"crc\" DevicePath \"\"" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.049555 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30769d81-67b5-4bf2-bd86-3dba738f3789" path="/var/lib/kubelet/pods/30769d81-67b5-4bf2-bd86-3dba738f3789/volumes" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.271254 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c5szc/crc-debug-6q64z"] Nov 22 10:15:24 crc kubenswrapper[4846]: E1122 10:15:24.271718 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30769d81-67b5-4bf2-bd86-3dba738f3789" containerName="container-00" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.271740 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="30769d81-67b5-4bf2-bd86-3dba738f3789" containerName="container-00" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.272012 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="30769d81-67b5-4bf2-bd86-3dba738f3789" containerName="container-00" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.273006 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.318863 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-host\") pod \"crc-debug-6q64z\" (UID: \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\") " pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.318954 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpmj8\" (UniqueName: \"kubernetes.io/projected/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-kube-api-access-vpmj8\") pod \"crc-debug-6q64z\" (UID: \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\") " pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.422086 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-host\") pod \"crc-debug-6q64z\" (UID: \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\") " pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.422219 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpmj8\" (UniqueName: \"kubernetes.io/projected/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-kube-api-access-vpmj8\") pod \"crc-debug-6q64z\" (UID: \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\") " pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.422309 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-host\") pod \"crc-debug-6q64z\" (UID: \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\") " pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.447881 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpmj8\" (UniqueName: \"kubernetes.io/projected/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-kube-api-access-vpmj8\") pod \"crc-debug-6q64z\" (UID: \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\") " pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.554475 4846 scope.go:117] "RemoveContainer" containerID="94bc43ce7ac6f9e79fbc149028547cb69dca11dfa3ec7c570dd4da4659759f36" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.554726 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-c8ltt" Nov 22 10:15:24 crc kubenswrapper[4846]: I1122 10:15:24.604753 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:24 crc kubenswrapper[4846]: W1122 10:15:24.649055 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f0ba0f0_5a2c_4fef_8452_d837ba04b355.slice/crio-653c58eb32cbe7013342acda3f91b46610c38e6308eeea8baf5def747166134b WatchSource:0}: Error finding container 653c58eb32cbe7013342acda3f91b46610c38e6308eeea8baf5def747166134b: Status 404 returned error can't find the container with id 653c58eb32cbe7013342acda3f91b46610c38e6308eeea8baf5def747166134b Nov 22 10:15:25 crc kubenswrapper[4846]: I1122 10:15:25.591080 4846 generic.go:334] "Generic (PLEG): container finished" podID="0f0ba0f0-5a2c-4fef-8452-d837ba04b355" containerID="8df4c1ae17d6a6fa2a57b2efeba365d2fa030851dda612e04da0a38aeba68c11" exitCode=0 Nov 22 10:15:25 crc kubenswrapper[4846]: I1122 10:15:25.591203 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/crc-debug-6q64z" event={"ID":"0f0ba0f0-5a2c-4fef-8452-d837ba04b355","Type":"ContainerDied","Data":"8df4c1ae17d6a6fa2a57b2efeba365d2fa030851dda612e04da0a38aeba68c11"} Nov 22 10:15:25 crc kubenswrapper[4846]: I1122 10:15:25.591254 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/crc-debug-6q64z" event={"ID":"0f0ba0f0-5a2c-4fef-8452-d837ba04b355","Type":"ContainerStarted","Data":"653c58eb32cbe7013342acda3f91b46610c38e6308eeea8baf5def747166134b"} Nov 22 10:15:25 crc kubenswrapper[4846]: I1122 10:15:25.648143 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c5szc/crc-debug-6q64z"] Nov 22 10:15:25 crc kubenswrapper[4846]: I1122 10:15:25.658421 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c5szc/crc-debug-6q64z"] Nov 22 10:15:26 crc kubenswrapper[4846]: I1122 10:15:26.725376 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:26 crc kubenswrapper[4846]: I1122 10:15:26.869789 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpmj8\" (UniqueName: \"kubernetes.io/projected/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-kube-api-access-vpmj8\") pod \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\" (UID: \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\") " Nov 22 10:15:26 crc kubenswrapper[4846]: I1122 10:15:26.869833 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-host\") pod \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\" (UID: \"0f0ba0f0-5a2c-4fef-8452-d837ba04b355\") " Nov 22 10:15:26 crc kubenswrapper[4846]: I1122 10:15:26.870297 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-host" (OuterVolumeSpecName: "host") pod "0f0ba0f0-5a2c-4fef-8452-d837ba04b355" (UID: "0f0ba0f0-5a2c-4fef-8452-d837ba04b355"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 10:15:26 crc kubenswrapper[4846]: I1122 10:15:26.878359 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-kube-api-access-vpmj8" (OuterVolumeSpecName: "kube-api-access-vpmj8") pod "0f0ba0f0-5a2c-4fef-8452-d837ba04b355" (UID: "0f0ba0f0-5a2c-4fef-8452-d837ba04b355"). 
InnerVolumeSpecName "kube-api-access-vpmj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:15:26 crc kubenswrapper[4846]: I1122 10:15:26.955308 4846 scope.go:117] "RemoveContainer" containerID="3d0c0541f13a9fa342cfe9bc0ec1351012ca97fb222b4f938c53d635bd749675" Nov 22 10:15:26 crc kubenswrapper[4846]: I1122 10:15:26.974592 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpmj8\" (UniqueName: \"kubernetes.io/projected/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-kube-api-access-vpmj8\") on node \"crc\" DevicePath \"\"" Nov 22 10:15:26 crc kubenswrapper[4846]: I1122 10:15:26.974650 4846 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f0ba0f0-5a2c-4fef-8452-d837ba04b355-host\") on node \"crc\" DevicePath \"\"" Nov 22 10:15:27 crc kubenswrapper[4846]: I1122 10:15:27.611364 4846 scope.go:117] "RemoveContainer" containerID="8df4c1ae17d6a6fa2a57b2efeba365d2fa030851dda612e04da0a38aeba68c11" Nov 22 10:15:27 crc kubenswrapper[4846]: I1122 10:15:27.611581 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5szc/crc-debug-6q64z" Nov 22 10:15:28 crc kubenswrapper[4846]: I1122 10:15:28.044787 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0ba0f0-5a2c-4fef-8452-d837ba04b355" path="/var/lib/kubelet/pods/0f0ba0f0-5a2c-4fef-8452-d837ba04b355/volumes" Nov 22 10:15:37 crc kubenswrapper[4846]: I1122 10:15:37.035875 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:15:37 crc kubenswrapper[4846]: E1122 10:15:37.036762 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.008095 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55fdfc87fd-75r6l_54092b40-6b71-4920-b703-b6b44e0e2331/barbican-api/0.log" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.194296 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5ff9f749db-lj4qc_d45bb639-d116-4666-8aea-ba5bc8ca84ea/barbican-keystone-listener/0.log" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.227359 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55fdfc87fd-75r6l_54092b40-6b71-4920-b703-b6b44e0e2331/barbican-api-log/0.log" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.365078 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5ff9f749db-lj4qc_d45bb639-d116-4666-8aea-ba5bc8ca84ea/barbican-keystone-listener-log/0.log" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.402650 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5b796967d9-trff5_03409e82-9b6d-43ee-a770-96700e162fac/barbican-worker/0.log" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.489614 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5b796967d9-trff5_03409e82-9b6d-43ee-a770-96700e162fac/barbican-worker-log/0.log" Nov 22 10:15:42 crc 
kubenswrapper[4846]: I1122 10:15:42.650581 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw_2b50be33-843f-4f51-af42-decfb29306c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.747209 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b8240da2-e07e-4b79-81b7-4dffdf4b4c91/ceilometer-central-agent/0.log" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.823570 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b8240da2-e07e-4b79-81b7-4dffdf4b4c91/ceilometer-notification-agent/0.log" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.837648 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b8240da2-e07e-4b79-81b7-4dffdf4b4c91/proxy-httpd/0.log" Nov 22 10:15:42 crc kubenswrapper[4846]: I1122 10:15:42.929927 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b8240da2-e07e-4b79-81b7-4dffdf4b4c91/sg-core/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.077549 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2b86aa01-1c05-47da-9f91-ef71a5e6d7ec/cinder-api/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.099366 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2b86aa01-1c05-47da-9f91-ef71a5e6d7ec/cinder-api-log/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.301429 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ea0ad07d-59fe-4c26-b1a7-69b9181631d8/cinder-scheduler/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.383305 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ea0ad07d-59fe-4c26-b1a7-69b9181631d8/probe/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.494650 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-crlnp_4e8248db-f0c2-40ad-a534-e3076fae3466/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.598648 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-prvrx_3db36453-67bc-491e-b87f-df3a840178b1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.709686 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b22zv_fb7382e7-13c7-4cf5-9462-b58b330e0315/init/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.934851 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b22zv_fb7382e7-13c7-4cf5-9462-b58b330e0315/dnsmasq-dns/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.945647 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-grpjk_ee2ff4f5-0353-438b-850b-81b49a3d22ad/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:43 crc kubenswrapper[4846]: I1122 10:15:43.961150 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b22zv_fb7382e7-13c7-4cf5-9462-b58b330e0315/init/0.log" Nov 22 10:15:44 crc kubenswrapper[4846]: I1122 10:15:44.130774 4846 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_85455dd3-3442-40ad-bd48-80034e877a41/glance-httpd/0.log" Nov 22 10:15:44 crc kubenswrapper[4846]: I1122 10:15:44.185677 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_85455dd3-3442-40ad-bd48-80034e877a41/glance-log/0.log" Nov 22 10:15:44 crc kubenswrapper[4846]: I1122 10:15:44.322516 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_554a6b70-9c9c-4afd-9738-d207b3067a30/glance-log/0.log" Nov 22 10:15:44 crc kubenswrapper[4846]: I1122 10:15:44.350576 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_554a6b70-9c9c-4afd-9738-d207b3067a30/glance-httpd/0.log" Nov 22 10:15:44 crc kubenswrapper[4846]: I1122 10:15:44.622596 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dfd5ccb4b-fpl7v_76c862f1-2cb3-4598-9be8-f8ff8bbab6f3/horizon/0.log" Nov 22 10:15:44 crc kubenswrapper[4846]: I1122 10:15:44.703911 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-78vtn_39001bd9-e368-4530-be7d-97c756cb4d39/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:44 crc kubenswrapper[4846]: I1122 10:15:44.853321 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dfd5ccb4b-fpl7v_76c862f1-2cb3-4598-9be8-f8ff8bbab6f3/horizon-log/0.log" Nov 22 10:15:44 crc kubenswrapper[4846]: I1122 10:15:44.893368 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-bs5cw_c12b9ae4-5d39-4ce1-bca3-8b128038532e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:45 crc kubenswrapper[4846]: I1122 10:15:45.131823 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29396761-j28rw_06e59565-2673-4e50-a150-a4f336c8dbfe/keystone-cron/0.log" Nov 22 10:15:45 crc kubenswrapper[4846]: I1122 10:15:45.190677 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5ccd94b5cf-fd5rp_fe29ba72-dfe7-4536-bf56-c282d31d2acb/keystone-api/0.log" Nov 22 10:15:45 crc kubenswrapper[4846]: I1122 10:15:45.331588 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4377a3fa-e17a-42e4-ab0b-37f76e90dbf9/kube-state-metrics/0.log" Nov 22 10:15:45 crc kubenswrapper[4846]: I1122 10:15:45.402146 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl_06a4ae02-37d7-458b-879a-64951da9e75a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:45 crc kubenswrapper[4846]: I1122 10:15:45.800071 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7f98fdfc57-v8bnv_eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4/neutron-api/0.log" Nov 22 10:15:45 crc kubenswrapper[4846]: I1122 10:15:45.928530 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7f98fdfc57-v8bnv_eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4/neutron-httpd/0.log" Nov 22 10:15:46 crc kubenswrapper[4846]: I1122 10:15:46.009539 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4_34347b18-5391-4078-8165-175276d8747e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:46 crc kubenswrapper[4846]: I1122 10:15:46.450438 4846 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_525d7ecc-cc33-4162-82f8-bfa33a4b15ed/nova-cell0-conductor-conductor/0.log" Nov 22 10:15:46 crc kubenswrapper[4846]: I1122 10:15:46.527416 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_732fe70d-07f5-455f-b20a-5a4d0c92c764/nova-api-log/0.log" Nov 22 10:15:46 crc kubenswrapper[4846]: I1122 10:15:46.768520 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7b0d3ce0-49e4-4e73-b2e9-ce405a023987/nova-cell1-conductor-conductor/0.log" Nov 22 10:15:46 crc kubenswrapper[4846]: I1122 10:15:46.803286 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_732fe70d-07f5-455f-b20a-5a4d0c92c764/nova-api-api/0.log" Nov 22 10:15:46 crc kubenswrapper[4846]: I1122 10:15:46.869628 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_faa4297f-4d7b-4942-958c-ccc0f3891f2a/nova-cell1-novncproxy-novncproxy/0.log" Nov 22 10:15:47 crc kubenswrapper[4846]: I1122 10:15:47.026843 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nl69g_e51d5d70-b3f1-41e3-b6c4-f3bf9b569417/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:47 crc kubenswrapper[4846]: I1122 10:15:47.235863 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea/nova-metadata-log/0.log" Nov 22 10:15:47 crc kubenswrapper[4846]: I1122 10:15:47.458119 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4e66c89-9999-4584-a149-2c18589a522a/mysql-bootstrap/0.log" Nov 22 10:15:47 crc kubenswrapper[4846]: I1122 10:15:47.464719 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_177c2c54-1d5d-409c-8592-141b25fc59cc/nova-scheduler-scheduler/0.log" Nov 22 10:15:47 crc kubenswrapper[4846]: I1122 10:15:47.758624 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4e66c89-9999-4584-a149-2c18589a522a/mysql-bootstrap/0.log" Nov 22 10:15:47 crc kubenswrapper[4846]: I1122 10:15:47.787799 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4e66c89-9999-4584-a149-2c18589a522a/galera/0.log" Nov 22 10:15:47 crc kubenswrapper[4846]: I1122 10:15:47.957801 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_93cad534-86a5-4420-951f-859efc86a70a/mysql-bootstrap/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.141651 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_93cad534-86a5-4420-951f-859efc86a70a/mysql-bootstrap/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.169535 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_93cad534-86a5-4420-951f-859efc86a70a/galera/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.307298 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0e37cf7b-6c4e-44c5-8193-38a0888efeee/openstackclient/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.386557 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-576fl_65c370a7-5d69-437a-98d2-810e97b9a5b7/ovn-controller/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.514432 4846 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea/nova-metadata-metadata/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.572846 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bnq8b_80b3f55d-b10e-40f1-9d45-4ed801491f54/openstack-network-exporter/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.754230 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdxdm_9315fa04-bcf9-4013-be72-f29a5cf95f4e/ovsdb-server-init/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.914260 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdxdm_9315fa04-bcf9-4013-be72-f29a5cf95f4e/ovsdb-server-init/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.943829 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdxdm_9315fa04-bcf9-4013-be72-f29a5cf95f4e/ovs-vswitchd/0.log" Nov 22 10:15:48 crc kubenswrapper[4846]: I1122 10:15:48.992865 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdxdm_9315fa04-bcf9-4013-be72-f29a5cf95f4e/ovsdb-server/0.log" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.036268 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:15:49 crc kubenswrapper[4846]: E1122 10:15:49.036536 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.156070 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-rfbkn_d326c85b-6234-469b-b6f4-8a4d72b62dab/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.230327 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fa80bcbe-b4a6-4515-b366-9ba9b0d92440/openstack-network-exporter/0.log" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.306614 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fa80bcbe-b4a6-4515-b366-9ba9b0d92440/ovn-northd/0.log" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.432806 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a5c5e879-a8c6-4758-a577-00d371164c9d/ovsdbserver-nb/0.log" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.443855 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a5c5e879-a8c6-4758-a577-00d371164c9d/openstack-network-exporter/0.log" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.581095 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aff4ba43-41a2-420b-8f89-99c69c1f3cfc/openstack-network-exporter/0.log" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.683479 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aff4ba43-41a2-420b-8f89-99c69c1f3cfc/ovsdbserver-sb/0.log" Nov 22 10:15:49 crc 
kubenswrapper[4846]: I1122 10:15:49.842129 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-569956d6b4-jtk8r_2f8c4b78-83b6-4f98-a4e2-ef7f56043775/placement-api/0.log" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.901424 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-569956d6b4-jtk8r_2f8c4b78-83b6-4f98-a4e2-ef7f56043775/placement-log/0.log" Nov 22 10:15:49 crc kubenswrapper[4846]: I1122 10:15:49.921102 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_812351d5-d992-4243-94c9-3328217b37b9/setup-container/0.log" Nov 22 10:15:50 crc kubenswrapper[4846]: I1122 10:15:50.154625 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_812351d5-d992-4243-94c9-3328217b37b9/setup-container/0.log" Nov 22 10:15:50 crc kubenswrapper[4846]: I1122 10:15:50.156453 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_812351d5-d992-4243-94c9-3328217b37b9/rabbitmq/0.log" Nov 22 10:15:50 crc kubenswrapper[4846]: I1122 10:15:50.235688 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b44e9aa-f202-48be-bace-279f29824c1b/setup-container/0.log" Nov 22 10:15:50 crc kubenswrapper[4846]: I1122 10:15:50.512792 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b44e9aa-f202-48be-bace-279f29824c1b/setup-container/0.log" Nov 22 10:15:50 crc kubenswrapper[4846]: I1122 10:15:50.522211 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b44e9aa-f202-48be-bace-279f29824c1b/rabbitmq/0.log" Nov 22 10:15:50 crc kubenswrapper[4846]: I1122 10:15:50.576941 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg_0364c9c7-ad57-4109-bdf0-9c888a609515/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:50 crc kubenswrapper[4846]: I1122 10:15:50.765666 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn_2565b5ab-c381-4a01-bc51-98d00dc7ce25/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:50 crc kubenswrapper[4846]: I1122 10:15:50.834400 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-dmvg6_da832f40-8579-415e-82c8-3e66684eb241/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:50 crc kubenswrapper[4846]: I1122 10:15:50.958866 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gkqr4_6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.093587 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-t6fpn_cf943f33-8c4e-4195-aa85-c1f60841b9ab/ssh-known-hosts-edpm-deployment/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.334627 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-86d575f679-k6l72_52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2/proxy-server/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.364016 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-86d575f679-k6l72_52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2/proxy-httpd/0.log" Nov 22 10:15:51 crc 
kubenswrapper[4846]: I1122 10:15:51.398093 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hgdvt_6f537097-bfac-4915-833f-ee9a52e7d8a5/swift-ring-rebalance/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.525529 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/account-auditor/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.608306 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/account-reaper/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.643358 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/account-replicator/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.799937 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/account-server/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.808576 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/container-auditor/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.834576 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/container-server/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.837892 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/container-replicator/0.log" Nov 22 10:15:51 crc kubenswrapper[4846]: I1122 10:15:51.982267 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/container-updater/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.046565 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-expirer/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.056204 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-auditor/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.058570 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-replicator/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.162555 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-server/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.218748 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-updater/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.260390 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/rsync/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.312181 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/swift-recon-cron/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.459541 4846 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm_7fa86a5b-2dbc-4e12-bf49-ea58d02854b0/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.586264 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0746377b-0ff5-4289-b4b6-1e9c3a166533/tempest-tests-tempest-tests-runner/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.754774 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_423242c9-5d5f-4a1d-83db-13989d8d78b1/test-operator-logs-container/0.log" Nov 22 10:15:52 crc kubenswrapper[4846]: I1122 10:15:52.785279 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz_cfeec82e-6d58-4819-8715-7d0febbe480c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:16:02 crc kubenswrapper[4846]: I1122 10:16:02.313664 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4d174fc1-bcf2-4812-9766-875d3ca3efe5/memcached/0.log" Nov 22 10:16:03 crc kubenswrapper[4846]: I1122 10:16:03.035233 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:16:03 crc kubenswrapper[4846]: E1122 10:16:03.035555 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:16:16 crc kubenswrapper[4846]: I1122 10:16:16.484068 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 10:16:17 crc kubenswrapper[4846]: I1122 10:16:17.036008 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:16:17 crc kubenswrapper[4846]: E1122 10:16:17.037495 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:16:18 crc kubenswrapper[4846]: I1122 10:16:18.723324 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/util/0.log" Nov 22 10:16:18 crc kubenswrapper[4846]: I1122 10:16:18.895093 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/util/0.log" Nov 22 10:16:18 crc kubenswrapper[4846]: I1122 10:16:18.946967 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/pull/0.log" Nov 22 10:16:18 crc kubenswrapper[4846]: I1122 10:16:18.952726 4846 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/pull/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.117999 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/util/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.171331 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/pull/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.198584 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/extract/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.334429 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-qcqgd_01860b24-58b0-422d-a390-fc783a2f4990/kube-rbac-proxy/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.409183 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-qcqgd_01860b24-58b0-422d-a390-fc783a2f4990/manager/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.452330 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-r4xp2_371bad3e-fcc3-42c5-a563-fc7d6aa5f275/kube-rbac-proxy/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.559093 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-r4xp2_371bad3e-fcc3-42c5-a563-fc7d6aa5f275/manager/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.629264 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-rb6cp_f1008fc2-d21a-4775-8505-12116c0a1d94/kube-rbac-proxy/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.632105 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-rb6cp_f1008fc2-d21a-4775-8505-12116c0a1d94/manager/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.766947 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-fwlcn_4dac3679-62ae-408f-b3ba-1809daaceb47/kube-rbac-proxy/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.906981 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-fwlcn_4dac3679-62ae-408f-b3ba-1809daaceb47/manager/0.log" Nov 22 10:16:19 crc kubenswrapper[4846]: I1122 10:16:19.983611 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-gqktx_539f5169-bf3b-4c3c-828a-8490d4d758d8/kube-rbac-proxy/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.030184 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-gqktx_539f5169-bf3b-4c3c-828a-8490d4d758d8/manager/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: 
I1122 10:16:20.090834 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-vt8xd_c4abfa7d-5927-41f1-af53-bc1ea6878bc1/kube-rbac-proxy/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.193271 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-vt8xd_c4abfa7d-5927-41f1-af53-bc1ea6878bc1/manager/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.286995 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6ccc968f7b-dxpcq_f7cb339f-9ebe-441d-ae17-43ad2ce13201/kube-rbac-proxy/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.462065 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6ccc968f7b-dxpcq_f7cb339f-9ebe-441d-ae17-43ad2ce13201/manager/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.495470 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-9wwm2_22e226e0-ebce-4d63-9379-109fe06b88da/manager/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.515271 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-9wwm2_22e226e0-ebce-4d63-9379-109fe06b88da/kube-rbac-proxy/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.659575 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-ltkkr_ab7af809-056a-45c1-bdd0-5e4a8bea02ef/kube-rbac-proxy/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.778748 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-ltkkr_ab7af809-056a-45c1-bdd0-5e4a8bea02ef/manager/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.804191 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-c8ssm_e84b7960-5cd2-4557-9b3c-a98ed4784006/kube-rbac-proxy/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.864436 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-c8ssm_e84b7960-5cd2-4557-9b3c-a98ed4784006/manager/0.log" Nov 22 10:16:20 crc kubenswrapper[4846]: I1122 10:16:20.975246 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-jbjc4_f4a50a36-b951-4342-b092-c94bea3d860e/kube-rbac-proxy/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.039262 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-jbjc4_f4a50a36-b951-4342-b092-c94bea3d860e/manager/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.170439 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-jwx4v_07d22ef0-2712-4daf-a620-081fee41f68f/kube-rbac-proxy/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.223737 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-jwx4v_07d22ef0-2712-4daf-a620-081fee41f68f/manager/0.log" Nov 22 10:16:21 crc 
kubenswrapper[4846]: I1122 10:16:21.290232 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-t8nfb_766c68ab-9022-4efd-84a3-af4aedf7d7b2/kube-rbac-proxy/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.421129 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-9jbkf_facf2ae5-028f-4413-a2d6-e503489ae5f3/kube-rbac-proxy/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.456813 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-t8nfb_766c68ab-9022-4efd-84a3-af4aedf7d7b2/manager/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.500271 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-9jbkf_facf2ae5-028f-4413-a2d6-e503489ae5f3/manager/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.657230 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-thll2_eaacbd1d-48b7-40a3-b7e4-48fc074e37fb/kube-rbac-proxy/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.702701 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-thll2_eaacbd1d-48b7-40a3-b7e4-48fc074e37fb/manager/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.900829 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67485f68cb-z25cl_7fa6485e-01f3-43e7-ac4e-f639cd3983d5/kube-rbac-proxy/0.log" Nov 22 10:16:21 crc kubenswrapper[4846]: I1122 10:16:21.956590 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-559dfbff4-8cpxr_d1b0081a-5f33-484d-8250-9ec2ab872b64/kube-rbac-proxy/0.log" Nov 22 10:16:22 crc kubenswrapper[4846]: I1122 10:16:22.194512 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-bh6tp_562d6113-df0b-4993-b7b8-1cace4f13fe0/registry-server/0.log" Nov 22 10:16:22 crc kubenswrapper[4846]: I1122 10:16:22.316472 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-559dfbff4-8cpxr_d1b0081a-5f33-484d-8250-9ec2ab872b64/operator/0.log" Nov 22 10:16:22 crc kubenswrapper[4846]: I1122 10:16:22.429170 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-458hx_5454b9eb-3a18-47d6-ba8e-1b7230659b26/kube-rbac-proxy/0.log" Nov 22 10:16:22 crc kubenswrapper[4846]: I1122 10:16:22.630029 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-458hx_5454b9eb-3a18-47d6-ba8e-1b7230659b26/manager/0.log" Nov 22 10:16:22 crc kubenswrapper[4846]: I1122 10:16:22.632459 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-7lg8k_671de1b8-d3f3-4a1e-8572-e2840bf58e17/kube-rbac-proxy/0.log" Nov 22 10:16:22 crc kubenswrapper[4846]: I1122 10:16:22.681708 4846 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-7lg8k_671de1b8-d3f3-4a1e-8572-e2840bf58e17/manager/0.log" Nov 22 10:16:22 crc kubenswrapper[4846]: I1122 10:16:22.838179 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-7v42k_f6c75a18-6338-4da3-8b61-a973a8589e66/operator/0.log" Nov 22 10:16:22 crc kubenswrapper[4846]: I1122 10:16:22.975570 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67485f68cb-z25cl_7fa6485e-01f3-43e7-ac4e-f639cd3983d5/manager/0.log" Nov 22 10:16:23 crc kubenswrapper[4846]: I1122 10:16:23.007962 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-cnhgp_113bd687-6dff-4159-b034-3a27a0683260/kube-rbac-proxy/0.log" Nov 22 10:16:23 crc kubenswrapper[4846]: I1122 10:16:23.066524 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-cnhgp_113bd687-6dff-4159-b034-3a27a0683260/manager/0.log" Nov 22 10:16:23 crc kubenswrapper[4846]: I1122 10:16:23.131024 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-qbwl5_63f03060-74f5-437f-bb06-a2626c791a06/kube-rbac-proxy/0.log" Nov 22 10:16:23 crc kubenswrapper[4846]: I1122 10:16:23.254154 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-qbwl5_63f03060-74f5-437f-bb06-a2626c791a06/manager/0.log" Nov 22 10:16:23 crc kubenswrapper[4846]: I1122 10:16:23.298076 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-644f7_8cec188d-f264-4a62-96f1-93e309820fe6/kube-rbac-proxy/0.log" Nov 22 10:16:23 crc kubenswrapper[4846]: I1122 10:16:23.377555 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-644f7_8cec188d-f264-4a62-96f1-93e309820fe6/manager/0.log" Nov 22 10:16:23 crc kubenswrapper[4846]: I1122 10:16:23.421223 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-7f7qj_895d9e5d-08de-4611-a844-c2db9e8e1839/kube-rbac-proxy/0.log" Nov 22 10:16:23 crc kubenswrapper[4846]: I1122 10:16:23.461525 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-7f7qj_895d9e5d-08de-4611-a844-c2db9e8e1839/manager/0.log" Nov 22 10:16:29 crc kubenswrapper[4846]: I1122 10:16:29.035523 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:16:29 crc kubenswrapper[4846]: E1122 10:16:29.036508 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:16:41 crc kubenswrapper[4846]: I1122 10:16:41.807337 4846 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-b95xr_ee403130-f909-4216-a9ff-8a4cb41d4017/control-plane-machine-set-operator/0.log" Nov 22 10:16:41 crc kubenswrapper[4846]: I1122 10:16:41.814861 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lz8p8_01e5ec75-28e3-4baa-8501-cbe8c740ec3f/kube-rbac-proxy/0.log" Nov 22 10:16:41 crc kubenswrapper[4846]: I1122 10:16:41.959189 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lz8p8_01e5ec75-28e3-4baa-8501-cbe8c740ec3f/machine-api-operator/0.log" Nov 22 10:16:42 crc kubenswrapper[4846]: I1122 10:16:42.035942 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:16:42 crc kubenswrapper[4846]: E1122 10:16:42.036356 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:16:53 crc kubenswrapper[4846]: I1122 10:16:53.036338 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:16:53 crc kubenswrapper[4846]: E1122 10:16:53.037600 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:16:55 crc kubenswrapper[4846]: I1122 10:16:55.327422 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nb9wl_1d9cea2b-9f89-437e-a0f3-875b123a47d3/cert-manager-controller/0.log" Nov 22 10:16:55 crc kubenswrapper[4846]: I1122 10:16:55.465173 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-d4jkh_c55358d6-9876-4e6a-9b06-08db6080a803/cert-manager-cainjector/0.log" Nov 22 10:16:55 crc kubenswrapper[4846]: I1122 10:16:55.534447 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-mppx8_27cc0714-ab99-4ddc-9e9c-66f24bba9fac/cert-manager-webhook/0.log" Nov 22 10:17:07 crc kubenswrapper[4846]: I1122 10:17:07.764633 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-zplwx_ea454b74-e77b-4f90-8311-563ab0e66191/nmstate-console-plugin/0.log" Nov 22 10:17:07 crc kubenswrapper[4846]: I1122 10:17:07.926373 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-bkbgk_9173cda0-1bab-4e52-96e3-4e3c564b846f/kube-rbac-proxy/0.log" Nov 22 10:17:07 crc kubenswrapper[4846]: I1122 10:17:07.944724 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lpsrm_dd1e7111-d57d-44c4-bcdb-7045dc626f01/nmstate-handler/0.log" Nov 22 10:17:07 crc kubenswrapper[4846]: I1122 10:17:07.985487 4846 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-bkbgk_9173cda0-1bab-4e52-96e3-4e3c564b846f/nmstate-metrics/0.log" Nov 22 10:17:08 crc kubenswrapper[4846]: I1122 10:17:08.036381 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:17:08 crc kubenswrapper[4846]: E1122 10:17:08.036904 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:17:08 crc kubenswrapper[4846]: I1122 10:17:08.135924 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-629pv_1fecb21a-594d-4e4f-a063-37cbf0e0d5ea/nmstate-operator/0.log" Nov 22 10:17:08 crc kubenswrapper[4846]: I1122 10:17:08.159374 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-55lsz_c6f850af-f692-4fa4-b289-1fd426f79090/nmstate-webhook/0.log" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.211372 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9kp2q"] Nov 22 10:17:16 crc kubenswrapper[4846]: E1122 10:17:16.212780 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0ba0f0-5a2c-4fef-8452-d837ba04b355" containerName="container-00" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.215350 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0ba0f0-5a2c-4fef-8452-d837ba04b355" containerName="container-00" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.216646 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0ba0f0-5a2c-4fef-8452-d837ba04b355" containerName="container-00" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.238729 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.246055 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9kp2q"] Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.344027 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdh7x\" (UniqueName: \"kubernetes.io/projected/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-kube-api-access-pdh7x\") pod \"certified-operators-9kp2q\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.344129 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-catalog-content\") pod \"certified-operators-9kp2q\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.344594 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-utilities\") pod \"certified-operators-9kp2q\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.446219 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-catalog-content\") pod \"certified-operators-9kp2q\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.446427 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-utilities\") pod \"certified-operators-9kp2q\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.446479 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdh7x\" (UniqueName: \"kubernetes.io/projected/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-kube-api-access-pdh7x\") pod \"certified-operators-9kp2q\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.446797 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-catalog-content\") pod \"certified-operators-9kp2q\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.447008 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-utilities\") pod \"certified-operators-9kp2q\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.471483 4846 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pdh7x\" (UniqueName: \"kubernetes.io/projected/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-kube-api-access-pdh7x\") pod \"certified-operators-9kp2q\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:16 crc kubenswrapper[4846]: I1122 10:17:16.562626 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:17 crc kubenswrapper[4846]: I1122 10:17:17.085068 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9kp2q"] Nov 22 10:17:17 crc kubenswrapper[4846]: I1122 10:17:17.145757 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kp2q" event={"ID":"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8","Type":"ContainerStarted","Data":"e7de6cc126e9b7a79cf37e38ffd26153a6ae249e8d9bf0aa2253c44b5b9169d3"} Nov 22 10:17:18 crc kubenswrapper[4846]: I1122 10:17:18.178997 4846 generic.go:334] "Generic (PLEG): container finished" podID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerID="b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d" exitCode=0 Nov 22 10:17:18 crc kubenswrapper[4846]: I1122 10:17:18.179204 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kp2q" event={"ID":"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8","Type":"ContainerDied","Data":"b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d"} Nov 22 10:17:18 crc kubenswrapper[4846]: I1122 10:17:18.181332 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:17:19 crc kubenswrapper[4846]: I1122 10:17:19.192004 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kp2q" event={"ID":"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8","Type":"ContainerStarted","Data":"6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b"} Nov 22 10:17:20 crc kubenswrapper[4846]: I1122 10:17:20.203929 4846 generic.go:334] "Generic (PLEG): container finished" podID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerID="6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b" exitCode=0 Nov 22 10:17:20 crc kubenswrapper[4846]: I1122 10:17:20.203997 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kp2q" event={"ID":"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8","Type":"ContainerDied","Data":"6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b"} Nov 22 10:17:21 crc kubenswrapper[4846]: I1122 10:17:21.036037 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:17:21 crc kubenswrapper[4846]: E1122 10:17:21.036879 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:17:21 crc kubenswrapper[4846]: I1122 10:17:21.214008 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kp2q" 
event={"ID":"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8","Type":"ContainerStarted","Data":"3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b"} Nov 22 10:17:23 crc kubenswrapper[4846]: I1122 10:17:23.516364 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-z5dqq_425496ff-38a1-4d67-b702-9bb864465158/kube-rbac-proxy/0.log" Nov 22 10:17:23 crc kubenswrapper[4846]: I1122 10:17:23.607850 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-z5dqq_425496ff-38a1-4d67-b702-9bb864465158/controller/0.log" Nov 22 10:17:23 crc kubenswrapper[4846]: I1122 10:17:23.732998 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-frr-files/0.log" Nov 22 10:17:23 crc kubenswrapper[4846]: I1122 10:17:23.852702 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-metrics/0.log" Nov 22 10:17:23 crc kubenswrapper[4846]: I1122 10:17:23.864602 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-frr-files/0.log" Nov 22 10:17:23 crc kubenswrapper[4846]: I1122 10:17:23.865714 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-reloader/0.log" Nov 22 10:17:23 crc kubenswrapper[4846]: I1122 10:17:23.938439 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-reloader/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.198109 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-metrics/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.271856 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-frr-files/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.289170 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-reloader/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.304101 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-metrics/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.432510 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-frr-files/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.432799 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-reloader/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.454772 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-metrics/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.509813 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/controller/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.613958 4846 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/frr-metrics/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.648988 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/kube-rbac-proxy/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.715712 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/kube-rbac-proxy-frr/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.805554 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/reloader/0.log" Nov 22 10:17:24 crc kubenswrapper[4846]: I1122 10:17:24.988297 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-g4nfr_ae71f435-af46-44ac-afdb-57dea9cd1925/frr-k8s-webhook-server/0.log" Nov 22 10:17:25 crc kubenswrapper[4846]: I1122 10:17:25.140332 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6b9465489d-lwlfq_a9a28c92-48ff-4026-819b-70068881c12b/manager/0.log" Nov 22 10:17:25 crc kubenswrapper[4846]: I1122 10:17:25.275135 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57ff77b6c8-sd485_2c8911b3-3f77-4666-822f-e40c1100c67f/webhook-server/0.log" Nov 22 10:17:25 crc kubenswrapper[4846]: I1122 10:17:25.441808 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jtgbm_b4e18041-980a-4cbb-ba17-98b3f6032c57/kube-rbac-proxy/0.log" Nov 22 10:17:25 crc kubenswrapper[4846]: I1122 10:17:25.939845 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jtgbm_b4e18041-980a-4cbb-ba17-98b3f6032c57/speaker/0.log" Nov 22 10:17:26 crc kubenswrapper[4846]: I1122 10:17:26.029878 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/frr/0.log" Nov 22 10:17:26 crc kubenswrapper[4846]: I1122 10:17:26.562912 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:26 crc kubenswrapper[4846]: I1122 10:17:26.563375 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:26 crc kubenswrapper[4846]: I1122 10:17:26.611230 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:26 crc kubenswrapper[4846]: I1122 10:17:26.628786 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9kp2q" podStartSLOduration=8.19824766 podStartE2EDuration="10.62877091s" podCreationTimestamp="2025-11-22 10:17:16 +0000 UTC" firstStartedPulling="2025-11-22 10:17:18.180855981 +0000 UTC m=+3813.116545670" lastFinishedPulling="2025-11-22 10:17:20.611379271 +0000 UTC m=+3815.547068920" observedRunningTime="2025-11-22 10:17:21.240825858 +0000 UTC m=+3816.176515507" watchObservedRunningTime="2025-11-22 10:17:26.62877091 +0000 UTC m=+3821.564460559" Nov 22 10:17:27 crc kubenswrapper[4846]: I1122 10:17:27.309059 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:27 crc 
kubenswrapper[4846]: I1122 10:17:27.356185 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9kp2q"] Nov 22 10:17:29 crc kubenswrapper[4846]: I1122 10:17:29.276720 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9kp2q" podUID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerName="registry-server" containerID="cri-o://3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b" gracePeriod=2 Nov 22 10:17:29 crc kubenswrapper[4846]: I1122 10:17:29.754176 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:29 crc kubenswrapper[4846]: I1122 10:17:29.901785 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-catalog-content\") pod \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " Nov 22 10:17:29 crc kubenswrapper[4846]: I1122 10:17:29.902173 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdh7x\" (UniqueName: \"kubernetes.io/projected/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-kube-api-access-pdh7x\") pod \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " Nov 22 10:17:29 crc kubenswrapper[4846]: I1122 10:17:29.902323 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-utilities\") pod \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\" (UID: \"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8\") " Nov 22 10:17:29 crc kubenswrapper[4846]: I1122 10:17:29.903993 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-utilities" (OuterVolumeSpecName: "utilities") pod "50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" (UID: "50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:17:29 crc kubenswrapper[4846]: I1122 10:17:29.909444 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-kube-api-access-pdh7x" (OuterVolumeSpecName: "kube-api-access-pdh7x") pod "50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" (UID: "50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8"). InnerVolumeSpecName "kube-api-access-pdh7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:17:29 crc kubenswrapper[4846]: I1122 10:17:29.966217 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" (UID: "50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.004210 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.004552 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.004718 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdh7x\" (UniqueName: \"kubernetes.io/projected/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8-kube-api-access-pdh7x\") on node \"crc\" DevicePath \"\"" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.287908 4846 generic.go:334] "Generic (PLEG): container finished" podID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerID="3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b" exitCode=0 Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.287957 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kp2q" event={"ID":"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8","Type":"ContainerDied","Data":"3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b"} Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.287981 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kp2q" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.288002 4846 scope.go:117] "RemoveContainer" containerID="3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.287988 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kp2q" event={"ID":"50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8","Type":"ContainerDied","Data":"e7de6cc126e9b7a79cf37e38ffd26153a6ae249e8d9bf0aa2253c44b5b9169d3"} Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.317471 4846 scope.go:117] "RemoveContainer" containerID="6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.319660 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9kp2q"] Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.328648 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9kp2q"] Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.343380 4846 scope.go:117] "RemoveContainer" containerID="b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.382267 4846 scope.go:117] "RemoveContainer" containerID="3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b" Nov 22 10:17:30 crc kubenswrapper[4846]: E1122 10:17:30.382792 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b\": container with ID starting with 3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b not found: ID does not exist" containerID="3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.382835 
4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b"} err="failed to get container status \"3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b\": rpc error: code = NotFound desc = could not find container \"3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b\": container with ID starting with 3e9e82db4398dd95afa0156140c7b8d45cc91d95e3846a0ae764b6b959e26e7b not found: ID does not exist" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.382858 4846 scope.go:117] "RemoveContainer" containerID="6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b" Nov 22 10:17:30 crc kubenswrapper[4846]: E1122 10:17:30.383266 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b\": container with ID starting with 6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b not found: ID does not exist" containerID="6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.383287 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b"} err="failed to get container status \"6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b\": rpc error: code = NotFound desc = could not find container \"6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b\": container with ID starting with 6e3140bccef6d8b632f100c491ed8fa66912dbf298ae535c06d2acd3e1520e5b not found: ID does not exist" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.383308 4846 scope.go:117] "RemoveContainer" containerID="b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d" Nov 22 10:17:30 crc kubenswrapper[4846]: E1122 10:17:30.383767 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d\": container with ID starting with b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d not found: ID does not exist" containerID="b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d" Nov 22 10:17:30 crc kubenswrapper[4846]: I1122 10:17:30.383837 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d"} err="failed to get container status \"b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d\": rpc error: code = NotFound desc = could not find container \"b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d\": container with ID starting with b0d2cda35e2a5ed2cb208df004f37458fa712d281c673d272dd2d7aaaaba3c9d not found: ID does not exist" Nov 22 10:17:32 crc kubenswrapper[4846]: I1122 10:17:32.045735 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" path="/var/lib/kubelet/pods/50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8/volumes" Nov 22 10:17:34 crc kubenswrapper[4846]: I1122 10:17:34.036560 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:17:34 crc kubenswrapper[4846]: E1122 10:17:34.037929 4846 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.144642 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/util/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.271850 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/util/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.311404 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/pull/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.337262 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/pull/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.553134 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/pull/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.555873 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/util/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.557398 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/extract/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.708781 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-utilities/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.934271 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-utilities/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.938166 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-content/0.log" Nov 22 10:17:39 crc kubenswrapper[4846]: I1122 10:17:39.949168 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-content/0.log" Nov 22 10:17:40 crc kubenswrapper[4846]: I1122 10:17:40.092227 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-utilities/0.log" Nov 22 10:17:40 crc kubenswrapper[4846]: I1122 10:17:40.100889 4846 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-content/0.log" Nov 22 10:17:40 crc kubenswrapper[4846]: I1122 10:17:40.319749 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-utilities/0.log" Nov 22 10:17:40 crc kubenswrapper[4846]: I1122 10:17:40.564757 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-content/0.log" Nov 22 10:17:40 crc kubenswrapper[4846]: I1122 10:17:40.596211 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-utilities/0.log" Nov 22 10:17:40 crc kubenswrapper[4846]: I1122 10:17:40.665536 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-content/0.log" Nov 22 10:17:40 crc kubenswrapper[4846]: I1122 10:17:40.765517 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/registry-server/0.log" Nov 22 10:17:40 crc kubenswrapper[4846]: I1122 10:17:40.846534 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-utilities/0.log" Nov 22 10:17:40 crc kubenswrapper[4846]: I1122 10:17:40.857127 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-content/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.013089 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/util/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.291796 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/pull/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.304509 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/pull/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.319876 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/util/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.474671 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/registry-server/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.548177 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/extract/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.566278 4846 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/util/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.613917 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/pull/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.722911 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2lfzs_b2d91bbe-e29e-4a12-a7a8-92c26c4a977b/marketplace-operator/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.794947 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-utilities/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.990177 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-utilities/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.990187 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-content/0.log" Nov 22 10:17:41 crc kubenswrapper[4846]: I1122 10:17:41.991514 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-content/0.log" Nov 22 10:17:42 crc kubenswrapper[4846]: I1122 10:17:42.136780 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-content/0.log" Nov 22 10:17:42 crc kubenswrapper[4846]: I1122 10:17:42.164298 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-utilities/0.log" Nov 22 10:17:42 crc kubenswrapper[4846]: I1122 10:17:42.257322 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/registry-server/0.log" Nov 22 10:17:42 crc kubenswrapper[4846]: I1122 10:17:42.320337 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-utilities/0.log" Nov 22 10:17:42 crc kubenswrapper[4846]: I1122 10:17:42.527728 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-content/0.log" Nov 22 10:17:42 crc kubenswrapper[4846]: I1122 10:17:42.541863 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-content/0.log" Nov 22 10:17:42 crc kubenswrapper[4846]: I1122 10:17:42.546861 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-utilities/0.log" Nov 22 10:17:42 crc kubenswrapper[4846]: I1122 10:17:42.724973 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-content/0.log" Nov 22 10:17:42 crc kubenswrapper[4846]: I1122 10:17:42.731247 4846 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-utilities/0.log" Nov 22 10:17:43 crc kubenswrapper[4846]: I1122 10:17:43.267070 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/registry-server/0.log" Nov 22 10:17:45 crc kubenswrapper[4846]: I1122 10:17:45.035761 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:17:45 crc kubenswrapper[4846]: E1122 10:17:45.036631 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:17:58 crc kubenswrapper[4846]: I1122 10:17:58.035987 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:17:58 crc kubenswrapper[4846]: E1122 10:17:58.036778 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:18:09 crc kubenswrapper[4846]: I1122 10:18:09.035231 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:18:09 crc kubenswrapper[4846]: E1122 10:18:09.035913 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:18:24 crc kubenswrapper[4846]: I1122 10:18:24.035636 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:18:24 crc kubenswrapper[4846]: E1122 10:18:24.036219 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:18:36 crc kubenswrapper[4846]: I1122 10:18:36.041387 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:18:36 crc kubenswrapper[4846]: E1122 10:18:36.042342 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:18:50 crc kubenswrapper[4846]: I1122 10:18:50.035797 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:18:50 crc kubenswrapper[4846]: E1122 10:18:50.037242 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:19:01 crc kubenswrapper[4846]: I1122 10:19:01.036384 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:19:01 crc kubenswrapper[4846]: E1122 10:19:01.037156 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:19:12 crc kubenswrapper[4846]: I1122 10:19:12.035392 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:19:12 crc kubenswrapper[4846]: E1122 10:19:12.036663 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:19:20 crc kubenswrapper[4846]: I1122 10:19:20.409706 4846 generic.go:334] "Generic (PLEG): container finished" podID="e321743e-6fcf-4416-aadd-078013511625" containerID="1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd" exitCode=0 Nov 22 10:19:20 crc kubenswrapper[4846]: I1122 10:19:20.410283 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c5szc/must-gather-d7tx8" event={"ID":"e321743e-6fcf-4416-aadd-078013511625","Type":"ContainerDied","Data":"1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd"} Nov 22 10:19:20 crc kubenswrapper[4846]: I1122 10:19:20.411763 4846 scope.go:117] "RemoveContainer" containerID="1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd" Nov 22 10:19:21 crc kubenswrapper[4846]: I1122 10:19:21.254558 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c5szc_must-gather-d7tx8_e321743e-6fcf-4416-aadd-078013511625/gather/0.log" Nov 22 10:19:24 crc kubenswrapper[4846]: I1122 10:19:24.035529 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:19:24 crc kubenswrapper[4846]: E1122 10:19:24.036527 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:19:28 crc kubenswrapper[4846]: I1122 10:19:28.422106 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c5szc/must-gather-d7tx8"] Nov 22 10:19:28 crc kubenswrapper[4846]: I1122 10:19:28.422875 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-c5szc/must-gather-d7tx8" podUID="e321743e-6fcf-4416-aadd-078013511625" containerName="copy" containerID="cri-o://e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd" gracePeriod=2 Nov 22 10:19:28 crc kubenswrapper[4846]: I1122 10:19:28.431769 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c5szc/must-gather-d7tx8"] Nov 22 10:19:28 crc kubenswrapper[4846]: I1122 10:19:28.925553 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c5szc_must-gather-d7tx8_e321743e-6fcf-4416-aadd-078013511625/copy/0.log" Nov 22 10:19:28 crc kubenswrapper[4846]: I1122 10:19:28.926624 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.011721 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksxd4\" (UniqueName: \"kubernetes.io/projected/e321743e-6fcf-4416-aadd-078013511625-kube-api-access-ksxd4\") pod \"e321743e-6fcf-4416-aadd-078013511625\" (UID: \"e321743e-6fcf-4416-aadd-078013511625\") " Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.011792 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e321743e-6fcf-4416-aadd-078013511625-must-gather-output\") pod \"e321743e-6fcf-4416-aadd-078013511625\" (UID: \"e321743e-6fcf-4416-aadd-078013511625\") " Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.018863 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e321743e-6fcf-4416-aadd-078013511625-kube-api-access-ksxd4" (OuterVolumeSpecName: "kube-api-access-ksxd4") pod "e321743e-6fcf-4416-aadd-078013511625" (UID: "e321743e-6fcf-4416-aadd-078013511625"). InnerVolumeSpecName "kube-api-access-ksxd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.114172 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksxd4\" (UniqueName: \"kubernetes.io/projected/e321743e-6fcf-4416-aadd-078013511625-kube-api-access-ksxd4\") on node \"crc\" DevicePath \"\"" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.211796 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e321743e-6fcf-4416-aadd-078013511625-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e321743e-6fcf-4416-aadd-078013511625" (UID: "e321743e-6fcf-4416-aadd-078013511625"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.215887 4846 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e321743e-6fcf-4416-aadd-078013511625-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.524667 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c5szc_must-gather-d7tx8_e321743e-6fcf-4416-aadd-078013511625/copy/0.log" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.525097 4846 generic.go:334] "Generic (PLEG): container finished" podID="e321743e-6fcf-4416-aadd-078013511625" containerID="e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd" exitCode=143 Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.525150 4846 scope.go:117] "RemoveContainer" containerID="e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.525193 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c5szc/must-gather-d7tx8" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.551440 4846 scope.go:117] "RemoveContainer" containerID="1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.597451 4846 scope.go:117] "RemoveContainer" containerID="e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd" Nov 22 10:19:29 crc kubenswrapper[4846]: E1122 10:19:29.598338 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd\": container with ID starting with e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd not found: ID does not exist" containerID="e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.598387 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd"} err="failed to get container status \"e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd\": rpc error: code = NotFound desc = could not find container \"e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd\": container with ID starting with e10684602a5be7f537fde09ce908c702101dc38e6b3ef8a03e1e390ad34a4afd not found: ID does not exist" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.598422 4846 scope.go:117] "RemoveContainer" containerID="1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd" Nov 22 10:19:29 crc kubenswrapper[4846]: E1122 10:19:29.598775 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd\": container with ID starting with 1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd not found: ID does not exist" containerID="1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd" Nov 22 10:19:29 crc kubenswrapper[4846]: I1122 10:19:29.598816 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd"} err="failed to get container status 
\"1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd\": rpc error: code = NotFound desc = could not find container \"1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd\": container with ID starting with 1646f637c5a8038c87c464a89a0a61a280c9b0fe34e4c740ad05f72af0a1e5cd not found: ID does not exist" Nov 22 10:19:30 crc kubenswrapper[4846]: I1122 10:19:30.045805 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e321743e-6fcf-4416-aadd-078013511625" path="/var/lib/kubelet/pods/e321743e-6fcf-4416-aadd-078013511625/volumes" Nov 22 10:19:38 crc kubenswrapper[4846]: I1122 10:19:38.035567 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:19:38 crc kubenswrapper[4846]: E1122 10:19:38.036416 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:19:49 crc kubenswrapper[4846]: I1122 10:19:49.037233 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:19:49 crc kubenswrapper[4846]: E1122 10:19:49.038745 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:20:01 crc kubenswrapper[4846]: I1122 10:20:01.035624 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:20:02 crc kubenswrapper[4846]: I1122 10:20:02.205118 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"e3447fd2376a9e62f224b7e26d25446dcd902e6769cfc416d5b85222bd3cdb68"} Nov 22 10:21:27 crc kubenswrapper[4846]: I1122 10:21:27.204887 4846 scope.go:117] "RemoveContainer" containerID="0aa48728a1bc5805aa027f708a954a7c2f4c348c3eb5c0298946b6c915a2a302" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.912823 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7jdhv"] Nov 22 10:21:44 crc kubenswrapper[4846]: E1122 10:21:44.913771 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerName="extract-content" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.913786 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerName="extract-content" Nov 22 10:21:44 crc kubenswrapper[4846]: E1122 10:21:44.913803 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerName="registry-server" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.913810 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" 
containerName="registry-server" Nov 22 10:21:44 crc kubenswrapper[4846]: E1122 10:21:44.913824 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e321743e-6fcf-4416-aadd-078013511625" containerName="copy" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.913833 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="e321743e-6fcf-4416-aadd-078013511625" containerName="copy" Nov 22 10:21:44 crc kubenswrapper[4846]: E1122 10:21:44.913854 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e321743e-6fcf-4416-aadd-078013511625" containerName="gather" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.913861 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="e321743e-6fcf-4416-aadd-078013511625" containerName="gather" Nov 22 10:21:44 crc kubenswrapper[4846]: E1122 10:21:44.913884 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerName="extract-utilities" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.913892 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerName="extract-utilities" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.914124 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="e321743e-6fcf-4416-aadd-078013511625" containerName="copy" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.914141 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="e321743e-6fcf-4416-aadd-078013511625" containerName="gather" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.914164 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d6a85f-1ce3-4564-a1c3-fdd5509f3fa8" containerName="registry-server" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.915441 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:44 crc kubenswrapper[4846]: I1122 10:21:44.939998 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jdhv"] Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.052350 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-catalog-content\") pod \"redhat-marketplace-7jdhv\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.052427 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-utilities\") pod \"redhat-marketplace-7jdhv\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.052539 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjr6\" (UniqueName: \"kubernetes.io/projected/f3972120-7c64-4a58-b159-62e07775bfd6-kube-api-access-9jjr6\") pod \"redhat-marketplace-7jdhv\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.111162 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-859b2"] Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.114536 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.121908 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-859b2"] Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.154818 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-catalog-content\") pod \"redhat-marketplace-7jdhv\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.154954 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-utilities\") pod \"redhat-marketplace-7jdhv\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.155090 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjr6\" (UniqueName: \"kubernetes.io/projected/f3972120-7c64-4a58-b159-62e07775bfd6-kube-api-access-9jjr6\") pod \"redhat-marketplace-7jdhv\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.155342 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-catalog-content\") pod \"redhat-marketplace-7jdhv\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " 
pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.156470 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-utilities\") pod \"redhat-marketplace-7jdhv\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.257997 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-utilities\") pod \"community-operators-859b2\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.258117 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-catalog-content\") pod \"community-operators-859b2\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.258199 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh4cj\" (UniqueName: \"kubernetes.io/projected/53d0a755-7470-4c33-9a5e-535c12ba1463-kube-api-access-rh4cj\") pod \"community-operators-859b2\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.280324 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjr6\" (UniqueName: \"kubernetes.io/projected/f3972120-7c64-4a58-b159-62e07775bfd6-kube-api-access-9jjr6\") pod \"redhat-marketplace-7jdhv\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.359376 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-utilities\") pod \"community-operators-859b2\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.359426 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-catalog-content\") pod \"community-operators-859b2\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.359505 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh4cj\" (UniqueName: \"kubernetes.io/projected/53d0a755-7470-4c33-9a5e-535c12ba1463-kube-api-access-rh4cj\") pod \"community-operators-859b2\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.360476 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-utilities\") pod \"community-operators-859b2\" (UID: 
\"53d0a755-7470-4c33-9a5e-535c12ba1463\") " pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.360571 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-catalog-content\") pod \"community-operators-859b2\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.377947 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh4cj\" (UniqueName: \"kubernetes.io/projected/53d0a755-7470-4c33-9a5e-535c12ba1463-kube-api-access-rh4cj\") pod \"community-operators-859b2\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.445981 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:45 crc kubenswrapper[4846]: I1122 10:21:45.545768 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:46 crc kubenswrapper[4846]: I1122 10:21:46.082893 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-859b2"] Nov 22 10:21:46 crc kubenswrapper[4846]: I1122 10:21:46.097741 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jdhv"] Nov 22 10:21:46 crc kubenswrapper[4846]: W1122 10:21:46.099060 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53d0a755_7470_4c33_9a5e_535c12ba1463.slice/crio-74070e6a713a843aedfbbcc7022f4d40129fd13ccbe0742ca2c9094a56a25936 WatchSource:0}: Error finding container 74070e6a713a843aedfbbcc7022f4d40129fd13ccbe0742ca2c9094a56a25936: Status 404 returned error can't find the container with id 74070e6a713a843aedfbbcc7022f4d40129fd13ccbe0742ca2c9094a56a25936 Nov 22 10:21:46 crc kubenswrapper[4846]: W1122 10:21:46.103986 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3972120_7c64_4a58_b159_62e07775bfd6.slice/crio-4ffd71f7e92917b32558e3a262f761b3a9e7315e9ff454003f18c7519dac8a4d WatchSource:0}: Error finding container 4ffd71f7e92917b32558e3a262f761b3a9e7315e9ff454003f18c7519dac8a4d: Status 404 returned error can't find the container with id 4ffd71f7e92917b32558e3a262f761b3a9e7315e9ff454003f18c7519dac8a4d Nov 22 10:21:46 crc kubenswrapper[4846]: I1122 10:21:46.370213 4846 generic.go:334] "Generic (PLEG): container finished" podID="f3972120-7c64-4a58-b159-62e07775bfd6" containerID="edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65" exitCode=0 Nov 22 10:21:46 crc kubenswrapper[4846]: I1122 10:21:46.370317 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jdhv" event={"ID":"f3972120-7c64-4a58-b159-62e07775bfd6","Type":"ContainerDied","Data":"edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65"} Nov 22 10:21:46 crc kubenswrapper[4846]: I1122 10:21:46.370366 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jdhv" 
event={"ID":"f3972120-7c64-4a58-b159-62e07775bfd6","Type":"ContainerStarted","Data":"4ffd71f7e92917b32558e3a262f761b3a9e7315e9ff454003f18c7519dac8a4d"} Nov 22 10:21:46 crc kubenswrapper[4846]: I1122 10:21:46.375244 4846 generic.go:334] "Generic (PLEG): container finished" podID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerID="1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2" exitCode=0 Nov 22 10:21:46 crc kubenswrapper[4846]: I1122 10:21:46.375290 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-859b2" event={"ID":"53d0a755-7470-4c33-9a5e-535c12ba1463","Type":"ContainerDied","Data":"1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2"} Nov 22 10:21:46 crc kubenswrapper[4846]: I1122 10:21:46.375312 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-859b2" event={"ID":"53d0a755-7470-4c33-9a5e-535c12ba1463","Type":"ContainerStarted","Data":"74070e6a713a843aedfbbcc7022f4d40129fd13ccbe0742ca2c9094a56a25936"} Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.400546 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-859b2" event={"ID":"53d0a755-7470-4c33-9a5e-535c12ba1463","Type":"ContainerStarted","Data":"4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062"} Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.404459 4846 generic.go:334] "Generic (PLEG): container finished" podID="f3972120-7c64-4a58-b159-62e07775bfd6" containerID="9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65" exitCode=0 Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.404519 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jdhv" event={"ID":"f3972120-7c64-4a58-b159-62e07775bfd6","Type":"ContainerDied","Data":"9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65"} Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.522913 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4ghxc"] Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.525547 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.533258 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4ghxc"] Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.709956 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-utilities\") pod \"redhat-operators-4ghxc\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.710061 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-catalog-content\") pod \"redhat-operators-4ghxc\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.710083 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzvjd\" (UniqueName: \"kubernetes.io/projected/6eb1271e-ba1e-4592-a579-56210cfd2870-kube-api-access-nzvjd\") pod \"redhat-operators-4ghxc\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.812295 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-catalog-content\") pod \"redhat-operators-4ghxc\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.812337 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzvjd\" (UniqueName: \"kubernetes.io/projected/6eb1271e-ba1e-4592-a579-56210cfd2870-kube-api-access-nzvjd\") pod \"redhat-operators-4ghxc\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.812509 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-utilities\") pod \"redhat-operators-4ghxc\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.812693 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-catalog-content\") pod \"redhat-operators-4ghxc\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.813123 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-utilities\") pod \"redhat-operators-4ghxc\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.838966 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nzvjd\" (UniqueName: \"kubernetes.io/projected/6eb1271e-ba1e-4592-a579-56210cfd2870-kube-api-access-nzvjd\") pod \"redhat-operators-4ghxc\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:47 crc kubenswrapper[4846]: I1122 10:21:47.848103 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:48 crc kubenswrapper[4846]: I1122 10:21:48.311100 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4ghxc"] Nov 22 10:21:48 crc kubenswrapper[4846]: W1122 10:21:48.317141 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb1271e_ba1e_4592_a579_56210cfd2870.slice/crio-24ff74d806848467a74346363c29f90514791e99a7149830b9e55e187a8a524e WatchSource:0}: Error finding container 24ff74d806848467a74346363c29f90514791e99a7149830b9e55e187a8a524e: Status 404 returned error can't find the container with id 24ff74d806848467a74346363c29f90514791e99a7149830b9e55e187a8a524e Nov 22 10:21:48 crc kubenswrapper[4846]: I1122 10:21:48.415186 4846 generic.go:334] "Generic (PLEG): container finished" podID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerID="4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062" exitCode=0 Nov 22 10:21:48 crc kubenswrapper[4846]: I1122 10:21:48.415488 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-859b2" event={"ID":"53d0a755-7470-4c33-9a5e-535c12ba1463","Type":"ContainerDied","Data":"4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062"} Nov 22 10:21:48 crc kubenswrapper[4846]: I1122 10:21:48.426355 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jdhv" event={"ID":"f3972120-7c64-4a58-b159-62e07775bfd6","Type":"ContainerStarted","Data":"639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549"} Nov 22 10:21:48 crc kubenswrapper[4846]: I1122 10:21:48.443228 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ghxc" event={"ID":"6eb1271e-ba1e-4592-a579-56210cfd2870","Type":"ContainerStarted","Data":"24ff74d806848467a74346363c29f90514791e99a7149830b9e55e187a8a524e"} Nov 22 10:21:48 crc kubenswrapper[4846]: I1122 10:21:48.477803 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7jdhv" podStartSLOduration=3.039444032 podStartE2EDuration="4.477783613s" podCreationTimestamp="2025-11-22 10:21:44 +0000 UTC" firstStartedPulling="2025-11-22 10:21:46.37196971 +0000 UTC m=+4081.307659359" lastFinishedPulling="2025-11-22 10:21:47.810309291 +0000 UTC m=+4082.745998940" observedRunningTime="2025-11-22 10:21:48.474302394 +0000 UTC m=+4083.409992043" watchObservedRunningTime="2025-11-22 10:21:48.477783613 +0000 UTC m=+4083.413473262" Nov 22 10:21:49 crc kubenswrapper[4846]: I1122 10:21:49.452515 4846 generic.go:334] "Generic (PLEG): container finished" podID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerID="a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf" exitCode=0 Nov 22 10:21:49 crc kubenswrapper[4846]: I1122 10:21:49.452640 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ghxc" 
event={"ID":"6eb1271e-ba1e-4592-a579-56210cfd2870","Type":"ContainerDied","Data":"a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf"} Nov 22 10:21:49 crc kubenswrapper[4846]: I1122 10:21:49.455759 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-859b2" event={"ID":"53d0a755-7470-4c33-9a5e-535c12ba1463","Type":"ContainerStarted","Data":"bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f"} Nov 22 10:21:49 crc kubenswrapper[4846]: I1122 10:21:49.510639 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-859b2" podStartSLOduration=1.836138896 podStartE2EDuration="4.5106191s" podCreationTimestamp="2025-11-22 10:21:45 +0000 UTC" firstStartedPulling="2025-11-22 10:21:46.377711043 +0000 UTC m=+4081.313400692" lastFinishedPulling="2025-11-22 10:21:49.052191237 +0000 UTC m=+4083.987880896" observedRunningTime="2025-11-22 10:21:49.504101154 +0000 UTC m=+4084.439790803" watchObservedRunningTime="2025-11-22 10:21:49.5106191 +0000 UTC m=+4084.446308759" Nov 22 10:21:50 crc kubenswrapper[4846]: I1122 10:21:50.477759 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ghxc" event={"ID":"6eb1271e-ba1e-4592-a579-56210cfd2870","Type":"ContainerStarted","Data":"14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801"} Nov 22 10:21:51 crc kubenswrapper[4846]: I1122 10:21:51.491375 4846 generic.go:334] "Generic (PLEG): container finished" podID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerID="14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801" exitCode=0 Nov 22 10:21:51 crc kubenswrapper[4846]: I1122 10:21:51.491451 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ghxc" event={"ID":"6eb1271e-ba1e-4592-a579-56210cfd2870","Type":"ContainerDied","Data":"14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801"} Nov 22 10:21:52 crc kubenswrapper[4846]: I1122 10:21:52.502487 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ghxc" event={"ID":"6eb1271e-ba1e-4592-a579-56210cfd2870","Type":"ContainerStarted","Data":"2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a"} Nov 22 10:21:52 crc kubenswrapper[4846]: I1122 10:21:52.524447 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4ghxc" podStartSLOduration=3.059108119 podStartE2EDuration="5.524424553s" podCreationTimestamp="2025-11-22 10:21:47 +0000 UTC" firstStartedPulling="2025-11-22 10:21:49.454523247 +0000 UTC m=+4084.390212916" lastFinishedPulling="2025-11-22 10:21:51.919839691 +0000 UTC m=+4086.855529350" observedRunningTime="2025-11-22 10:21:52.519960976 +0000 UTC m=+4087.455650665" watchObservedRunningTime="2025-11-22 10:21:52.524424553 +0000 UTC m=+4087.460114202" Nov 22 10:21:55 crc kubenswrapper[4846]: I1122 10:21:55.446113 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:55 crc kubenswrapper[4846]: I1122 10:21:55.446549 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:55 crc kubenswrapper[4846]: I1122 10:21:55.515566 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:55 crc 
kubenswrapper[4846]: I1122 10:21:55.548299 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:55 crc kubenswrapper[4846]: I1122 10:21:55.548544 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:55 crc kubenswrapper[4846]: I1122 10:21:55.597514 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:55 crc kubenswrapper[4846]: I1122 10:21:55.610515 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:56 crc kubenswrapper[4846]: I1122 10:21:56.097588 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-859b2"] Nov 22 10:21:56 crc kubenswrapper[4846]: I1122 10:21:56.624927 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:21:57 crc kubenswrapper[4846]: I1122 10:21:57.545621 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-859b2" podUID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerName="registry-server" containerID="cri-o://bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f" gracePeriod=2 Nov 22 10:21:57 crc kubenswrapper[4846]: I1122 10:21:57.848280 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:57 crc kubenswrapper[4846]: I1122 10:21:57.848645 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:21:57 crc kubenswrapper[4846]: I1122 10:21:57.895722 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jdhv"] Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.517357 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.558465 4846 generic.go:334] "Generic (PLEG): container finished" podID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerID="bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f" exitCode=0 Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.558528 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-859b2" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.558552 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-859b2" event={"ID":"53d0a755-7470-4c33-9a5e-535c12ba1463","Type":"ContainerDied","Data":"bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f"} Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.558598 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-859b2" event={"ID":"53d0a755-7470-4c33-9a5e-535c12ba1463","Type":"ContainerDied","Data":"74070e6a713a843aedfbbcc7022f4d40129fd13ccbe0742ca2c9094a56a25936"} Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.558624 4846 scope.go:117] "RemoveContainer" containerID="bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.575576 4846 scope.go:117] "RemoveContainer" containerID="4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.598212 4846 scope.go:117] "RemoveContainer" containerID="1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.623660 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-catalog-content\") pod \"53d0a755-7470-4c33-9a5e-535c12ba1463\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.623844 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh4cj\" (UniqueName: \"kubernetes.io/projected/53d0a755-7470-4c33-9a5e-535c12ba1463-kube-api-access-rh4cj\") pod \"53d0a755-7470-4c33-9a5e-535c12ba1463\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.623878 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-utilities\") pod \"53d0a755-7470-4c33-9a5e-535c12ba1463\" (UID: \"53d0a755-7470-4c33-9a5e-535c12ba1463\") " Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.624737 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-utilities" (OuterVolumeSpecName: "utilities") pod "53d0a755-7470-4c33-9a5e-535c12ba1463" (UID: "53d0a755-7470-4c33-9a5e-535c12ba1463"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.624928 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.629931 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d0a755-7470-4c33-9a5e-535c12ba1463-kube-api-access-rh4cj" (OuterVolumeSpecName: "kube-api-access-rh4cj") pod "53d0a755-7470-4c33-9a5e-535c12ba1463" (UID: "53d0a755-7470-4c33-9a5e-535c12ba1463"). InnerVolumeSpecName "kube-api-access-rh4cj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.681943 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53d0a755-7470-4c33-9a5e-535c12ba1463" (UID: "53d0a755-7470-4c33-9a5e-535c12ba1463"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.684366 4846 scope.go:117] "RemoveContainer" containerID="bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f" Nov 22 10:21:58 crc kubenswrapper[4846]: E1122 10:21:58.684788 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f\": container with ID starting with bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f not found: ID does not exist" containerID="bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.684852 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f"} err="failed to get container status \"bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f\": rpc error: code = NotFound desc = could not find container \"bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f\": container with ID starting with bafd3c49a2986605fa4290e494d7d995221c2d9c26569ca05adb3654d532e50f not found: ID does not exist" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.684908 4846 scope.go:117] "RemoveContainer" containerID="4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062" Nov 22 10:21:58 crc kubenswrapper[4846]: E1122 10:21:58.685264 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062\": container with ID starting with 4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062 not found: ID does not exist" containerID="4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.685311 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062"} err="failed to get container status \"4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062\": rpc error: code = NotFound desc = could not find container \"4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062\": container with ID starting with 4cda598d05e00172397bc4da4e93b1af8ac00c0e901958c0ffc480c4b823e062 not found: ID does not exist" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.685335 4846 scope.go:117] "RemoveContainer" containerID="1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2" Nov 22 10:21:58 crc kubenswrapper[4846]: E1122 10:21:58.685581 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2\": container with ID starting with 1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2 not found: ID does not exist" 
containerID="1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.685607 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2"} err="failed to get container status \"1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2\": rpc error: code = NotFound desc = could not find container \"1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2\": container with ID starting with 1f54abebea4a479367be4a07524c49a4732cf56ee37a5d4d65aab17530b9bbc2 not found: ID does not exist" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.727135 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh4cj\" (UniqueName: \"kubernetes.io/projected/53d0a755-7470-4c33-9a5e-535c12ba1463-kube-api-access-rh4cj\") on node \"crc\" DevicePath \"\"" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.727170 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53d0a755-7470-4c33-9a5e-535c12ba1463-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.908448 4846 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4ghxc" podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerName="registry-server" probeResult="failure" output=< Nov 22 10:21:58 crc kubenswrapper[4846]: timeout: failed to connect service ":50051" within 1s Nov 22 10:21:58 crc kubenswrapper[4846]: > Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.932735 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-859b2"] Nov 22 10:21:58 crc kubenswrapper[4846]: I1122 10:21:58.947300 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-859b2"] Nov 22 10:21:59 crc kubenswrapper[4846]: I1122 10:21:59.567412 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7jdhv" podUID="f3972120-7c64-4a58-b159-62e07775bfd6" containerName="registry-server" containerID="cri-o://639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549" gracePeriod=2 Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.061545 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d0a755-7470-4c33-9a5e-535c12ba1463" path="/var/lib/kubelet/pods/53d0a755-7470-4c33-9a5e-535c12ba1463/volumes" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.065067 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.152820 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-utilities\") pod \"f3972120-7c64-4a58-b159-62e07775bfd6\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.153181 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjr6\" (UniqueName: \"kubernetes.io/projected/f3972120-7c64-4a58-b159-62e07775bfd6-kube-api-access-9jjr6\") pod \"f3972120-7c64-4a58-b159-62e07775bfd6\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.153261 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-catalog-content\") pod \"f3972120-7c64-4a58-b159-62e07775bfd6\" (UID: \"f3972120-7c64-4a58-b159-62e07775bfd6\") " Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.156631 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-utilities" (OuterVolumeSpecName: "utilities") pod "f3972120-7c64-4a58-b159-62e07775bfd6" (UID: "f3972120-7c64-4a58-b159-62e07775bfd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.163170 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3972120-7c64-4a58-b159-62e07775bfd6-kube-api-access-9jjr6" (OuterVolumeSpecName: "kube-api-access-9jjr6") pod "f3972120-7c64-4a58-b159-62e07775bfd6" (UID: "f3972120-7c64-4a58-b159-62e07775bfd6"). InnerVolumeSpecName "kube-api-access-9jjr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.179589 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3972120-7c64-4a58-b159-62e07775bfd6" (UID: "f3972120-7c64-4a58-b159-62e07775bfd6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.254357 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.254388 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjr6\" (UniqueName: \"kubernetes.io/projected/f3972120-7c64-4a58-b159-62e07775bfd6-kube-api-access-9jjr6\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.254397 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3972120-7c64-4a58-b159-62e07775bfd6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.585471 4846 generic.go:334] "Generic (PLEG): container finished" podID="f3972120-7c64-4a58-b159-62e07775bfd6" containerID="639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549" exitCode=0 Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.585552 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jdhv" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.585545 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jdhv" event={"ID":"f3972120-7c64-4a58-b159-62e07775bfd6","Type":"ContainerDied","Data":"639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549"} Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.585633 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jdhv" event={"ID":"f3972120-7c64-4a58-b159-62e07775bfd6","Type":"ContainerDied","Data":"4ffd71f7e92917b32558e3a262f761b3a9e7315e9ff454003f18c7519dac8a4d"} Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.585676 4846 scope.go:117] "RemoveContainer" containerID="639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.625805 4846 scope.go:117] "RemoveContainer" containerID="9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.640394 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jdhv"] Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.650344 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jdhv"] Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.652725 4846 scope.go:117] "RemoveContainer" containerID="edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.732801 4846 scope.go:117] "RemoveContainer" containerID="639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549" Nov 22 10:22:00 crc kubenswrapper[4846]: E1122 10:22:00.733905 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549\": container with ID starting with 639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549 not found: ID does not exist" containerID="639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.733960 4846 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549"} err="failed to get container status \"639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549\": rpc error: code = NotFound desc = could not find container \"639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549\": container with ID starting with 639af5924bbc338e3837d77b10512cd0576c386cffa5c56a37f98561d5f2c549 not found: ID does not exist" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.733998 4846 scope.go:117] "RemoveContainer" containerID="9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65" Nov 22 10:22:00 crc kubenswrapper[4846]: E1122 10:22:00.734329 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65\": container with ID starting with 9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65 not found: ID does not exist" containerID="9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.734370 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65"} err="failed to get container status \"9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65\": rpc error: code = NotFound desc = could not find container \"9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65\": container with ID starting with 9251a449feb433fa190b00528d1287f79d423cd627b1332333b4caf702bf4d65 not found: ID does not exist" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.734398 4846 scope.go:117] "RemoveContainer" containerID="edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65" Nov 22 10:22:00 crc kubenswrapper[4846]: E1122 10:22:00.734645 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65\": container with ID starting with edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65 not found: ID does not exist" containerID="edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65" Nov 22 10:22:00 crc kubenswrapper[4846]: I1122 10:22:00.734680 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65"} err="failed to get container status \"edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65\": rpc error: code = NotFound desc = could not find container \"edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65\": container with ID starting with edbf2161057684138901b895fe9605da28c816357afe42cb826d01564e2f0a65 not found: ID does not exist" Nov 22 10:22:02 crc kubenswrapper[4846]: I1122 10:22:02.052397 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3972120-7c64-4a58-b159-62e07775bfd6" path="/var/lib/kubelet/pods/f3972120-7c64-4a58-b159-62e07775bfd6/volumes" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.737938 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5tk6n/must-gather-2g9sk"] Nov 22 10:22:06 crc kubenswrapper[4846]: E1122 10:22:06.739880 4846 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerName="extract-utilities" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.740143 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerName="extract-utilities" Nov 22 10:22:06 crc kubenswrapper[4846]: E1122 10:22:06.740195 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3972120-7c64-4a58-b159-62e07775bfd6" containerName="extract-content" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.740205 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3972120-7c64-4a58-b159-62e07775bfd6" containerName="extract-content" Nov 22 10:22:06 crc kubenswrapper[4846]: E1122 10:22:06.740234 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3972120-7c64-4a58-b159-62e07775bfd6" containerName="registry-server" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.740243 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3972120-7c64-4a58-b159-62e07775bfd6" containerName="registry-server" Nov 22 10:22:06 crc kubenswrapper[4846]: E1122 10:22:06.740278 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerName="extract-content" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.740286 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerName="extract-content" Nov 22 10:22:06 crc kubenswrapper[4846]: E1122 10:22:06.740299 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerName="registry-server" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.740308 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerName="registry-server" Nov 22 10:22:06 crc kubenswrapper[4846]: E1122 10:22:06.740325 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3972120-7c64-4a58-b159-62e07775bfd6" containerName="extract-utilities" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.740364 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3972120-7c64-4a58-b159-62e07775bfd6" containerName="extract-utilities" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.740901 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="53d0a755-7470-4c33-9a5e-535c12ba1463" containerName="registry-server" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.740931 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3972120-7c64-4a58-b159-62e07775bfd6" containerName="registry-server" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.742924 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tk6n/must-gather-2g9sk" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.748505 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5tk6n"/"openshift-service-ca.crt" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.748506 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5tk6n"/"default-dockercfg-c2whh" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.748736 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5tk6n"/"kube-root-ca.crt" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.776839 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5tk6n/must-gather-2g9sk"] Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.898741 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv7cr\" (UniqueName: \"kubernetes.io/projected/73e5c479-31c4-446d-89ff-dae0c3bc674c-kube-api-access-nv7cr\") pod \"must-gather-2g9sk\" (UID: \"73e5c479-31c4-446d-89ff-dae0c3bc674c\") " pod="openshift-must-gather-5tk6n/must-gather-2g9sk" Nov 22 10:22:06 crc kubenswrapper[4846]: I1122 10:22:06.898856 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73e5c479-31c4-446d-89ff-dae0c3bc674c-must-gather-output\") pod \"must-gather-2g9sk\" (UID: \"73e5c479-31c4-446d-89ff-dae0c3bc674c\") " pod="openshift-must-gather-5tk6n/must-gather-2g9sk" Nov 22 10:22:07 crc kubenswrapper[4846]: I1122 10:22:07.001244 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv7cr\" (UniqueName: \"kubernetes.io/projected/73e5c479-31c4-446d-89ff-dae0c3bc674c-kube-api-access-nv7cr\") pod \"must-gather-2g9sk\" (UID: \"73e5c479-31c4-446d-89ff-dae0c3bc674c\") " pod="openshift-must-gather-5tk6n/must-gather-2g9sk" Nov 22 10:22:07 crc kubenswrapper[4846]: I1122 10:22:07.001339 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73e5c479-31c4-446d-89ff-dae0c3bc674c-must-gather-output\") pod \"must-gather-2g9sk\" (UID: \"73e5c479-31c4-446d-89ff-dae0c3bc674c\") " pod="openshift-must-gather-5tk6n/must-gather-2g9sk" Nov 22 10:22:07 crc kubenswrapper[4846]: I1122 10:22:07.001828 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73e5c479-31c4-446d-89ff-dae0c3bc674c-must-gather-output\") pod \"must-gather-2g9sk\" (UID: \"73e5c479-31c4-446d-89ff-dae0c3bc674c\") " pod="openshift-must-gather-5tk6n/must-gather-2g9sk" Nov 22 10:22:07 crc kubenswrapper[4846]: I1122 10:22:07.021841 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv7cr\" (UniqueName: \"kubernetes.io/projected/73e5c479-31c4-446d-89ff-dae0c3bc674c-kube-api-access-nv7cr\") pod \"must-gather-2g9sk\" (UID: \"73e5c479-31c4-446d-89ff-dae0c3bc674c\") " pod="openshift-must-gather-5tk6n/must-gather-2g9sk" Nov 22 10:22:07 crc kubenswrapper[4846]: I1122 10:22:07.071218 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tk6n/must-gather-2g9sk" Nov 22 10:22:07 crc kubenswrapper[4846]: I1122 10:22:07.921108 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:22:07 crc kubenswrapper[4846]: I1122 10:22:07.925857 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5tk6n/must-gather-2g9sk"] Nov 22 10:22:07 crc kubenswrapper[4846]: W1122 10:22:07.942317 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73e5c479_31c4_446d_89ff_dae0c3bc674c.slice/crio-732c8582679e653b58268a1c7ba3492d0cd8d707894c999dbfdf6c6f00fa81f2 WatchSource:0}: Error finding container 732c8582679e653b58268a1c7ba3492d0cd8d707894c999dbfdf6c6f00fa81f2: Status 404 returned error can't find the container with id 732c8582679e653b58268a1c7ba3492d0cd8d707894c999dbfdf6c6f00fa81f2 Nov 22 10:22:07 crc kubenswrapper[4846]: I1122 10:22:07.976432 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:22:08 crc kubenswrapper[4846]: I1122 10:22:08.166903 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4ghxc"] Nov 22 10:22:08 crc kubenswrapper[4846]: I1122 10:22:08.663504 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/must-gather-2g9sk" event={"ID":"73e5c479-31c4-446d-89ff-dae0c3bc674c","Type":"ContainerStarted","Data":"43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f"} Nov 22 10:22:08 crc kubenswrapper[4846]: I1122 10:22:08.663833 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/must-gather-2g9sk" event={"ID":"73e5c479-31c4-446d-89ff-dae0c3bc674c","Type":"ContainerStarted","Data":"a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a"} Nov 22 10:22:08 crc kubenswrapper[4846]: I1122 10:22:08.663848 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/must-gather-2g9sk" event={"ID":"73e5c479-31c4-446d-89ff-dae0c3bc674c","Type":"ContainerStarted","Data":"732c8582679e653b58268a1c7ba3492d0cd8d707894c999dbfdf6c6f00fa81f2"} Nov 22 10:22:08 crc kubenswrapper[4846]: I1122 10:22:08.686687 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5tk6n/must-gather-2g9sk" podStartSLOduration=2.686671685 podStartE2EDuration="2.686671685s" podCreationTimestamp="2025-11-22 10:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:22:08.683093864 +0000 UTC m=+4103.618783513" watchObservedRunningTime="2025-11-22 10:22:08.686671685 +0000 UTC m=+4103.622361334" Nov 22 10:22:09 crc kubenswrapper[4846]: I1122 10:22:09.671277 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4ghxc" podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerName="registry-server" containerID="cri-o://2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a" gracePeriod=2 Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.203595 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.370671 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-catalog-content\") pod \"6eb1271e-ba1e-4592-a579-56210cfd2870\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.370773 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzvjd\" (UniqueName: \"kubernetes.io/projected/6eb1271e-ba1e-4592-a579-56210cfd2870-kube-api-access-nzvjd\") pod \"6eb1271e-ba1e-4592-a579-56210cfd2870\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.370828 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-utilities\") pod \"6eb1271e-ba1e-4592-a579-56210cfd2870\" (UID: \"6eb1271e-ba1e-4592-a579-56210cfd2870\") " Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.372128 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-utilities" (OuterVolumeSpecName: "utilities") pod "6eb1271e-ba1e-4592-a579-56210cfd2870" (UID: "6eb1271e-ba1e-4592-a579-56210cfd2870"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.376763 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb1271e-ba1e-4592-a579-56210cfd2870-kube-api-access-nzvjd" (OuterVolumeSpecName: "kube-api-access-nzvjd") pod "6eb1271e-ba1e-4592-a579-56210cfd2870" (UID: "6eb1271e-ba1e-4592-a579-56210cfd2870"). InnerVolumeSpecName "kube-api-access-nzvjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.475357 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzvjd\" (UniqueName: \"kubernetes.io/projected/6eb1271e-ba1e-4592-a579-56210cfd2870-kube-api-access-nzvjd\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.475736 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.517137 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6eb1271e-ba1e-4592-a579-56210cfd2870" (UID: "6eb1271e-ba1e-4592-a579-56210cfd2870"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.577354 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb1271e-ba1e-4592-a579-56210cfd2870-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.682248 4846 generic.go:334] "Generic (PLEG): container finished" podID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerID="2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a" exitCode=0 Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.682292 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ghxc" event={"ID":"6eb1271e-ba1e-4592-a579-56210cfd2870","Type":"ContainerDied","Data":"2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a"} Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.682324 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ghxc" event={"ID":"6eb1271e-ba1e-4592-a579-56210cfd2870","Type":"ContainerDied","Data":"24ff74d806848467a74346363c29f90514791e99a7149830b9e55e187a8a524e"} Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.682344 4846 scope.go:117] "RemoveContainer" containerID="2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.682373 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4ghxc" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.715565 4846 scope.go:117] "RemoveContainer" containerID="14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.721579 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4ghxc"] Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.732173 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4ghxc"] Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.742267 4846 scope.go:117] "RemoveContainer" containerID="a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.787636 4846 scope.go:117] "RemoveContainer" containerID="2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a" Nov 22 10:22:10 crc kubenswrapper[4846]: E1122 10:22:10.788071 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a\": container with ID starting with 2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a not found: ID does not exist" containerID="2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.788120 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a"} err="failed to get container status \"2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a\": rpc error: code = NotFound desc = could not find container \"2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a\": container with ID starting with 2903d9e5a4d3c61c0a5c1462a5746e0ff52355d0df4ee0f89c15b5d01760609a not found: ID does not exist" Nov 22 10:22:10 crc 
kubenswrapper[4846]: I1122 10:22:10.788153 4846 scope.go:117] "RemoveContainer" containerID="14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801" Nov 22 10:22:10 crc kubenswrapper[4846]: E1122 10:22:10.788499 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801\": container with ID starting with 14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801 not found: ID does not exist" containerID="14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.788537 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801"} err="failed to get container status \"14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801\": rpc error: code = NotFound desc = could not find container \"14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801\": container with ID starting with 14783cd7a4dcc5b40cb81b74bb1cf9778b837d018e637157d9aa563971411801 not found: ID does not exist" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.788563 4846 scope.go:117] "RemoveContainer" containerID="a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf" Nov 22 10:22:10 crc kubenswrapper[4846]: E1122 10:22:10.788807 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf\": container with ID starting with a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf not found: ID does not exist" containerID="a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf" Nov 22 10:22:10 crc kubenswrapper[4846]: I1122 10:22:10.788864 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf"} err="failed to get container status \"a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf\": rpc error: code = NotFound desc = could not find container \"a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf\": container with ID starting with a083fcf97429567acc355a1e9796afff55383ba9a7f2cf9313cb97207beaf5bf not found: ID does not exist" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.654811 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5tk6n/crc-debug-ffw7w"] Nov 22 10:22:11 crc kubenswrapper[4846]: E1122 10:22:11.655588 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerName="extract-utilities" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.655634 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerName="extract-utilities" Nov 22 10:22:11 crc kubenswrapper[4846]: E1122 10:22:11.655676 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerName="registry-server" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.655685 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerName="registry-server" Nov 22 10:22:11 crc kubenswrapper[4846]: E1122 10:22:11.655699 4846 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerName="extract-content" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.655707 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerName="extract-content" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.655959 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" containerName="registry-server" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.656713 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.696131 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71a27880-5135-4b7e-94fe-63431abea0fe-host\") pod \"crc-debug-ffw7w\" (UID: \"71a27880-5135-4b7e-94fe-63431abea0fe\") " pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.696187 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drtcx\" (UniqueName: \"kubernetes.io/projected/71a27880-5135-4b7e-94fe-63431abea0fe-kube-api-access-drtcx\") pod \"crc-debug-ffw7w\" (UID: \"71a27880-5135-4b7e-94fe-63431abea0fe\") " pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.798329 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71a27880-5135-4b7e-94fe-63431abea0fe-host\") pod \"crc-debug-ffw7w\" (UID: \"71a27880-5135-4b7e-94fe-63431abea0fe\") " pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.798380 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drtcx\" (UniqueName: \"kubernetes.io/projected/71a27880-5135-4b7e-94fe-63431abea0fe-kube-api-access-drtcx\") pod \"crc-debug-ffw7w\" (UID: \"71a27880-5135-4b7e-94fe-63431abea0fe\") " pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.798443 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71a27880-5135-4b7e-94fe-63431abea0fe-host\") pod \"crc-debug-ffw7w\" (UID: \"71a27880-5135-4b7e-94fe-63431abea0fe\") " pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.818879 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drtcx\" (UniqueName: \"kubernetes.io/projected/71a27880-5135-4b7e-94fe-63431abea0fe-kube-api-access-drtcx\") pod \"crc-debug-ffw7w\" (UID: \"71a27880-5135-4b7e-94fe-63431abea0fe\") " pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:11 crc kubenswrapper[4846]: I1122 10:22:11.979139 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:12 crc kubenswrapper[4846]: I1122 10:22:12.045492 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb1271e-ba1e-4592-a579-56210cfd2870" path="/var/lib/kubelet/pods/6eb1271e-ba1e-4592-a579-56210cfd2870/volumes" Nov 22 10:22:12 crc kubenswrapper[4846]: I1122 10:22:12.701834 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" event={"ID":"71a27880-5135-4b7e-94fe-63431abea0fe","Type":"ContainerStarted","Data":"e37e27c5aa2eea820b8d461f2c743dd6f2f42026098a9fe8cca0241ebce2555b"} Nov 22 10:22:12 crc kubenswrapper[4846]: I1122 10:22:12.702626 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" event={"ID":"71a27880-5135-4b7e-94fe-63431abea0fe","Type":"ContainerStarted","Data":"58449da54c9a9ca6cd5920248ac7db270f1c66a5d62f36b459cb43f4b293c9f6"} Nov 22 10:22:12 crc kubenswrapper[4846]: I1122 10:22:12.716884 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" podStartSLOduration=1.716860219 podStartE2EDuration="1.716860219s" podCreationTimestamp="2025-11-22 10:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:22:12.716021385 +0000 UTC m=+4107.651711044" watchObservedRunningTime="2025-11-22 10:22:12.716860219 +0000 UTC m=+4107.652549908" Nov 22 10:22:28 crc kubenswrapper[4846]: I1122 10:22:28.627079 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:22:28 crc kubenswrapper[4846]: I1122 10:22:28.628667 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:22:46 crc kubenswrapper[4846]: I1122 10:22:46.140611 4846 generic.go:334] "Generic (PLEG): container finished" podID="71a27880-5135-4b7e-94fe-63431abea0fe" containerID="e37e27c5aa2eea820b8d461f2c743dd6f2f42026098a9fe8cca0241ebce2555b" exitCode=0 Nov 22 10:22:46 crc kubenswrapper[4846]: I1122 10:22:46.140693 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" event={"ID":"71a27880-5135-4b7e-94fe-63431abea0fe","Type":"ContainerDied","Data":"e37e27c5aa2eea820b8d461f2c743dd6f2f42026098a9fe8cca0241ebce2555b"} Nov 22 10:22:47 crc kubenswrapper[4846]: I1122 10:22:47.265094 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:47 crc kubenswrapper[4846]: I1122 10:22:47.295888 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5tk6n/crc-debug-ffw7w"] Nov 22 10:22:47 crc kubenswrapper[4846]: I1122 10:22:47.303559 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5tk6n/crc-debug-ffw7w"] Nov 22 10:22:47 crc kubenswrapper[4846]: I1122 10:22:47.313329 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drtcx\" (UniqueName: \"kubernetes.io/projected/71a27880-5135-4b7e-94fe-63431abea0fe-kube-api-access-drtcx\") pod \"71a27880-5135-4b7e-94fe-63431abea0fe\" (UID: \"71a27880-5135-4b7e-94fe-63431abea0fe\") " Nov 22 10:22:47 crc kubenswrapper[4846]: I1122 10:22:47.313590 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71a27880-5135-4b7e-94fe-63431abea0fe-host\") pod \"71a27880-5135-4b7e-94fe-63431abea0fe\" (UID: \"71a27880-5135-4b7e-94fe-63431abea0fe\") " Nov 22 10:22:47 crc kubenswrapper[4846]: I1122 10:22:47.313715 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71a27880-5135-4b7e-94fe-63431abea0fe-host" (OuterVolumeSpecName: "host") pod "71a27880-5135-4b7e-94fe-63431abea0fe" (UID: "71a27880-5135-4b7e-94fe-63431abea0fe"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 10:22:47 crc kubenswrapper[4846]: I1122 10:22:47.314162 4846 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71a27880-5135-4b7e-94fe-63431abea0fe-host\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:47 crc kubenswrapper[4846]: I1122 10:22:47.320265 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a27880-5135-4b7e-94fe-63431abea0fe-kube-api-access-drtcx" (OuterVolumeSpecName: "kube-api-access-drtcx") pod "71a27880-5135-4b7e-94fe-63431abea0fe" (UID: "71a27880-5135-4b7e-94fe-63431abea0fe"). InnerVolumeSpecName "kube-api-access-drtcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:22:47 crc kubenswrapper[4846]: I1122 10:22:47.416067 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drtcx\" (UniqueName: \"kubernetes.io/projected/71a27880-5135-4b7e-94fe-63431abea0fe-kube-api-access-drtcx\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.045255 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a27880-5135-4b7e-94fe-63431abea0fe" path="/var/lib/kubelet/pods/71a27880-5135-4b7e-94fe-63431abea0fe/volumes" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.160315 4846 scope.go:117] "RemoveContainer" containerID="e37e27c5aa2eea820b8d461f2c743dd6f2f42026098a9fe8cca0241ebce2555b" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.160326 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-ffw7w" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.520033 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5tk6n/crc-debug-klk77"] Nov 22 10:22:48 crc kubenswrapper[4846]: E1122 10:22:48.520499 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a27880-5135-4b7e-94fe-63431abea0fe" containerName="container-00" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.520512 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a27880-5135-4b7e-94fe-63431abea0fe" containerName="container-00" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.520723 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a27880-5135-4b7e-94fe-63431abea0fe" containerName="container-00" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.521358 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.639765 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8f8t\" (UniqueName: \"kubernetes.io/projected/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-kube-api-access-g8f8t\") pod \"crc-debug-klk77\" (UID: \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\") " pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.640142 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-host\") pod \"crc-debug-klk77\" (UID: \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\") " pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.742205 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-host\") pod \"crc-debug-klk77\" (UID: \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\") " pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.742337 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-host\") pod \"crc-debug-klk77\" (UID: \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\") " pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.742370 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8f8t\" (UniqueName: \"kubernetes.io/projected/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-kube-api-access-g8f8t\") pod \"crc-debug-klk77\" (UID: \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\") " pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.762709 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8f8t\" (UniqueName: \"kubernetes.io/projected/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-kube-api-access-g8f8t\") pod \"crc-debug-klk77\" (UID: \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\") " pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:48 crc kubenswrapper[4846]: I1122 10:22:48.842571 4846 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:48 crc kubenswrapper[4846]: W1122 10:22:48.879280 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce8ed9d0_fe50_44b9_a38e_4a4debe2e336.slice/crio-9728e9cc347e74876d55c0bdae82629585a7b542cc09b76f3038f99950774383 WatchSource:0}: Error finding container 9728e9cc347e74876d55c0bdae82629585a7b542cc09b76f3038f99950774383: Status 404 returned error can't find the container with id 9728e9cc347e74876d55c0bdae82629585a7b542cc09b76f3038f99950774383 Nov 22 10:22:49 crc kubenswrapper[4846]: I1122 10:22:49.174241 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/crc-debug-klk77" event={"ID":"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336","Type":"ContainerStarted","Data":"20a25fd308526627fc80efd25792b2a5aedf0e50ee5db99910be644fae550cd9"} Nov 22 10:22:49 crc kubenswrapper[4846]: I1122 10:22:49.174535 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/crc-debug-klk77" event={"ID":"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336","Type":"ContainerStarted","Data":"9728e9cc347e74876d55c0bdae82629585a7b542cc09b76f3038f99950774383"} Nov 22 10:22:49 crc kubenswrapper[4846]: I1122 10:22:49.622768 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5tk6n/crc-debug-klk77"] Nov 22 10:22:49 crc kubenswrapper[4846]: I1122 10:22:49.630922 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5tk6n/crc-debug-klk77"] Nov 22 10:22:50 crc kubenswrapper[4846]: I1122 10:22:50.186619 4846 generic.go:334] "Generic (PLEG): container finished" podID="ce8ed9d0-fe50-44b9-a38e-4a4debe2e336" containerID="20a25fd308526627fc80efd25792b2a5aedf0e50ee5db99910be644fae550cd9" exitCode=0 Nov 22 10:22:50 crc kubenswrapper[4846]: I1122 10:22:50.313527 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:50 crc kubenswrapper[4846]: I1122 10:22:50.376842 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-host\") pod \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\" (UID: \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\") " Nov 22 10:22:50 crc kubenswrapper[4846]: I1122 10:22:50.377026 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-host" (OuterVolumeSpecName: "host") pod "ce8ed9d0-fe50-44b9-a38e-4a4debe2e336" (UID: "ce8ed9d0-fe50-44b9-a38e-4a4debe2e336"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 10:22:50 crc kubenswrapper[4846]: I1122 10:22:50.377062 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8f8t\" (UniqueName: \"kubernetes.io/projected/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-kube-api-access-g8f8t\") pod \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\" (UID: \"ce8ed9d0-fe50-44b9-a38e-4a4debe2e336\") " Nov 22 10:22:50 crc kubenswrapper[4846]: I1122 10:22:50.378145 4846 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-host\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:50 crc kubenswrapper[4846]: I1122 10:22:50.388398 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-kube-api-access-g8f8t" (OuterVolumeSpecName: "kube-api-access-g8f8t") pod "ce8ed9d0-fe50-44b9-a38e-4a4debe2e336" (UID: "ce8ed9d0-fe50-44b9-a38e-4a4debe2e336"). InnerVolumeSpecName "kube-api-access-g8f8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:22:50 crc kubenswrapper[4846]: I1122 10:22:50.480118 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8f8t\" (UniqueName: \"kubernetes.io/projected/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336-kube-api-access-g8f8t\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.067559 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5tk6n/crc-debug-pjdq6"] Nov 22 10:22:51 crc kubenswrapper[4846]: E1122 10:22:51.068241 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8ed9d0-fe50-44b9-a38e-4a4debe2e336" containerName="container-00" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.068255 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8ed9d0-fe50-44b9-a38e-4a4debe2e336" containerName="container-00" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.068455 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8ed9d0-fe50-44b9-a38e-4a4debe2e336" containerName="container-00" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.069090 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.195663 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdnr\" (UniqueName: \"kubernetes.io/projected/031f5331-896b-45e3-b973-b03ab16eb967-kube-api-access-vgdnr\") pod \"crc-debug-pjdq6\" (UID: \"031f5331-896b-45e3-b973-b03ab16eb967\") " pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.195749 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/031f5331-896b-45e3-b973-b03ab16eb967-host\") pod \"crc-debug-pjdq6\" (UID: \"031f5331-896b-45e3-b973-b03ab16eb967\") " pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.197004 4846 scope.go:117] "RemoveContainer" containerID="20a25fd308526627fc80efd25792b2a5aedf0e50ee5db99910be644fae550cd9" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.197065 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-klk77" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.297710 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdnr\" (UniqueName: \"kubernetes.io/projected/031f5331-896b-45e3-b973-b03ab16eb967-kube-api-access-vgdnr\") pod \"crc-debug-pjdq6\" (UID: \"031f5331-896b-45e3-b973-b03ab16eb967\") " pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.297757 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/031f5331-896b-45e3-b973-b03ab16eb967-host\") pod \"crc-debug-pjdq6\" (UID: \"031f5331-896b-45e3-b973-b03ab16eb967\") " pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.297979 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/031f5331-896b-45e3-b973-b03ab16eb967-host\") pod \"crc-debug-pjdq6\" (UID: \"031f5331-896b-45e3-b973-b03ab16eb967\") " pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.318482 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdnr\" (UniqueName: \"kubernetes.io/projected/031f5331-896b-45e3-b973-b03ab16eb967-kube-api-access-vgdnr\") pod \"crc-debug-pjdq6\" (UID: \"031f5331-896b-45e3-b973-b03ab16eb967\") " pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:51 crc kubenswrapper[4846]: I1122 10:22:51.385017 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:51 crc kubenswrapper[4846]: W1122 10:22:51.424606 4846 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod031f5331_896b_45e3_b973_b03ab16eb967.slice/crio-9e01aeeb7e26328c82226fe0e170c31476b55d1b11b4d2ca9a7ac880f57c9531 WatchSource:0}: Error finding container 9e01aeeb7e26328c82226fe0e170c31476b55d1b11b4d2ca9a7ac880f57c9531: Status 404 returned error can't find the container with id 9e01aeeb7e26328c82226fe0e170c31476b55d1b11b4d2ca9a7ac880f57c9531 Nov 22 10:22:52 crc kubenswrapper[4846]: I1122 10:22:52.048667 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8ed9d0-fe50-44b9-a38e-4a4debe2e336" path="/var/lib/kubelet/pods/ce8ed9d0-fe50-44b9-a38e-4a4debe2e336/volumes" Nov 22 10:22:52 crc kubenswrapper[4846]: I1122 10:22:52.222728 4846 generic.go:334] "Generic (PLEG): container finished" podID="031f5331-896b-45e3-b973-b03ab16eb967" containerID="fd407f4808f1a986e155d966ca9071c1a0a38184a0d31c4c07852b8dbc3b0de4" exitCode=0 Nov 22 10:22:52 crc kubenswrapper[4846]: I1122 10:22:52.222987 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" event={"ID":"031f5331-896b-45e3-b973-b03ab16eb967","Type":"ContainerDied","Data":"fd407f4808f1a986e155d966ca9071c1a0a38184a0d31c4c07852b8dbc3b0de4"} Nov 22 10:22:52 crc kubenswrapper[4846]: I1122 10:22:52.223037 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" event={"ID":"031f5331-896b-45e3-b973-b03ab16eb967","Type":"ContainerStarted","Data":"9e01aeeb7e26328c82226fe0e170c31476b55d1b11b4d2ca9a7ac880f57c9531"} Nov 22 10:22:52 crc kubenswrapper[4846]: I1122 10:22:52.273645 4846 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-must-gather-5tk6n/crc-debug-pjdq6"] Nov 22 10:22:52 crc kubenswrapper[4846]: I1122 10:22:52.286764 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5tk6n/crc-debug-pjdq6"] Nov 22 10:22:53 crc kubenswrapper[4846]: I1122 10:22:53.355434 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:53 crc kubenswrapper[4846]: I1122 10:22:53.436587 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/031f5331-896b-45e3-b973-b03ab16eb967-host\") pod \"031f5331-896b-45e3-b973-b03ab16eb967\" (UID: \"031f5331-896b-45e3-b973-b03ab16eb967\") " Nov 22 10:22:53 crc kubenswrapper[4846]: I1122 10:22:53.436682 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/031f5331-896b-45e3-b973-b03ab16eb967-host" (OuterVolumeSpecName: "host") pod "031f5331-896b-45e3-b973-b03ab16eb967" (UID: "031f5331-896b-45e3-b973-b03ab16eb967"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 10:22:53 crc kubenswrapper[4846]: I1122 10:22:53.436717 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgdnr\" (UniqueName: \"kubernetes.io/projected/031f5331-896b-45e3-b973-b03ab16eb967-kube-api-access-vgdnr\") pod \"031f5331-896b-45e3-b973-b03ab16eb967\" (UID: \"031f5331-896b-45e3-b973-b03ab16eb967\") " Nov 22 10:22:53 crc kubenswrapper[4846]: I1122 10:22:53.437257 4846 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/031f5331-896b-45e3-b973-b03ab16eb967-host\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:53 crc kubenswrapper[4846]: I1122 10:22:53.442922 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031f5331-896b-45e3-b973-b03ab16eb967-kube-api-access-vgdnr" (OuterVolumeSpecName: "kube-api-access-vgdnr") pod "031f5331-896b-45e3-b973-b03ab16eb967" (UID: "031f5331-896b-45e3-b973-b03ab16eb967"). InnerVolumeSpecName "kube-api-access-vgdnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:22:53 crc kubenswrapper[4846]: I1122 10:22:53.539624 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgdnr\" (UniqueName: \"kubernetes.io/projected/031f5331-896b-45e3-b973-b03ab16eb967-kube-api-access-vgdnr\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:54 crc kubenswrapper[4846]: I1122 10:22:54.046636 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031f5331-896b-45e3-b973-b03ab16eb967" path="/var/lib/kubelet/pods/031f5331-896b-45e3-b973-b03ab16eb967/volumes" Nov 22 10:22:54 crc kubenswrapper[4846]: I1122 10:22:54.248301 4846 scope.go:117] "RemoveContainer" containerID="fd407f4808f1a986e155d966ca9071c1a0a38184a0d31c4c07852b8dbc3b0de4" Nov 22 10:22:54 crc kubenswrapper[4846]: I1122 10:22:54.248360 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5tk6n/crc-debug-pjdq6" Nov 22 10:22:58 crc kubenswrapper[4846]: I1122 10:22:58.625854 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:22:58 crc kubenswrapper[4846]: I1122 10:22:58.626295 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:23:19 crc kubenswrapper[4846]: I1122 10:23:19.582596 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55fdfc87fd-75r6l_54092b40-6b71-4920-b703-b6b44e0e2331/barbican-api/0.log" Nov 22 10:23:19 crc kubenswrapper[4846]: I1122 10:23:19.737501 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55fdfc87fd-75r6l_54092b40-6b71-4920-b703-b6b44e0e2331/barbican-api-log/0.log" Nov 22 10:23:19 crc kubenswrapper[4846]: I1122 10:23:19.763548 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5ff9f749db-lj4qc_d45bb639-d116-4666-8aea-ba5bc8ca84ea/barbican-keystone-listener/0.log" Nov 22 10:23:19 crc kubenswrapper[4846]: I1122 10:23:19.851475 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5ff9f749db-lj4qc_d45bb639-d116-4666-8aea-ba5bc8ca84ea/barbican-keystone-listener-log/0.log" Nov 22 10:23:19 crc kubenswrapper[4846]: I1122 10:23:19.928536 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5b796967d9-trff5_03409e82-9b6d-43ee-a770-96700e162fac/barbican-worker/0.log" Nov 22 10:23:19 crc kubenswrapper[4846]: I1122 10:23:19.993676 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5b796967d9-trff5_03409e82-9b6d-43ee-a770-96700e162fac/barbican-worker-log/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.131169 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cs7kw_2b50be33-843f-4f51-af42-decfb29306c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.290131 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b8240da2-e07e-4b79-81b7-4dffdf4b4c91/ceilometer-central-agent/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.307769 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b8240da2-e07e-4b79-81b7-4dffdf4b4c91/proxy-httpd/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.356891 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b8240da2-e07e-4b79-81b7-4dffdf4b4c91/ceilometer-notification-agent/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.407389 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b8240da2-e07e-4b79-81b7-4dffdf4b4c91/sg-core/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.529494 4846 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_2b86aa01-1c05-47da-9f91-ef71a5e6d7ec/cinder-api-log/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.540478 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2b86aa01-1c05-47da-9f91-ef71a5e6d7ec/cinder-api/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.791603 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ea0ad07d-59fe-4c26-b1a7-69b9181631d8/probe/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.795775 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ea0ad07d-59fe-4c26-b1a7-69b9181631d8/cinder-scheduler/0.log" Nov 22 10:23:20 crc kubenswrapper[4846]: I1122 10:23:20.831570 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-crlnp_4e8248db-f0c2-40ad-a534-e3076fae3466/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:21 crc kubenswrapper[4846]: I1122 10:23:21.737881 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-prvrx_3db36453-67bc-491e-b87f-df3a840178b1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:21 crc kubenswrapper[4846]: I1122 10:23:21.767226 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b22zv_fb7382e7-13c7-4cf5-9462-b58b330e0315/init/0.log" Nov 22 10:23:21 crc kubenswrapper[4846]: I1122 10:23:21.894226 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b22zv_fb7382e7-13c7-4cf5-9462-b58b330e0315/init/0.log" Nov 22 10:23:21 crc kubenswrapper[4846]: I1122 10:23:21.960017 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-grpjk_ee2ff4f5-0353-438b-850b-81b49a3d22ad/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:21 crc kubenswrapper[4846]: I1122 10:23:21.991980 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b22zv_fb7382e7-13c7-4cf5-9462-b58b330e0315/dnsmasq-dns/0.log" Nov 22 10:23:22 crc kubenswrapper[4846]: I1122 10:23:22.165098 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_85455dd3-3442-40ad-bd48-80034e877a41/glance-httpd/0.log" Nov 22 10:23:22 crc kubenswrapper[4846]: I1122 10:23:22.195753 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_85455dd3-3442-40ad-bd48-80034e877a41/glance-log/0.log" Nov 22 10:23:22 crc kubenswrapper[4846]: I1122 10:23:22.349532 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_554a6b70-9c9c-4afd-9738-d207b3067a30/glance-httpd/0.log" Nov 22 10:23:22 crc kubenswrapper[4846]: I1122 10:23:22.357580 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_554a6b70-9c9c-4afd-9738-d207b3067a30/glance-log/0.log" Nov 22 10:23:22 crc kubenswrapper[4846]: I1122 10:23:22.523424 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dfd5ccb4b-fpl7v_76c862f1-2cb3-4598-9be8-f8ff8bbab6f3/horizon/0.log" Nov 22 10:23:22 crc kubenswrapper[4846]: I1122 10:23:22.650239 4846 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-78vtn_39001bd9-e368-4530-be7d-97c756cb4d39/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:22 crc kubenswrapper[4846]: I1122 10:23:22.960222 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5dfd5ccb4b-fpl7v_76c862f1-2cb3-4598-9be8-f8ff8bbab6f3/horizon-log/0.log" Nov 22 10:23:23 crc kubenswrapper[4846]: I1122 10:23:23.275520 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-bs5cw_c12b9ae4-5d39-4ce1-bca3-8b128038532e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:23 crc kubenswrapper[4846]: I1122 10:23:23.289836 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29396761-j28rw_06e59565-2673-4e50-a150-a4f336c8dbfe/keystone-cron/0.log" Nov 22 10:23:23 crc kubenswrapper[4846]: I1122 10:23:23.458995 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5ccd94b5cf-fd5rp_fe29ba72-dfe7-4536-bf56-c282d31d2acb/keystone-api/0.log" Nov 22 10:23:23 crc kubenswrapper[4846]: I1122 10:23:23.506574 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4377a3fa-e17a-42e4-ab0b-37f76e90dbf9/kube-state-metrics/0.log" Nov 22 10:23:23 crc kubenswrapper[4846]: I1122 10:23:23.566862 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-nnbxl_06a4ae02-37d7-458b-879a-64951da9e75a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:23 crc kubenswrapper[4846]: I1122 10:23:23.894181 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7f98fdfc57-v8bnv_eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4/neutron-httpd/0.log" Nov 22 10:23:23 crc kubenswrapper[4846]: I1122 10:23:23.924315 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7f98fdfc57-v8bnv_eaf34e1b-cf2c-4ff7-aaff-b8a0b35514b4/neutron-api/0.log" Nov 22 10:23:23 crc kubenswrapper[4846]: I1122 10:23:23.995314 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kpnt4_34347b18-5391-4078-8165-175276d8747e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:24 crc kubenswrapper[4846]: I1122 10:23:24.599793 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_732fe70d-07f5-455f-b20a-5a4d0c92c764/nova-api-log/0.log" Nov 22 10:23:24 crc kubenswrapper[4846]: I1122 10:23:24.650428 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_525d7ecc-cc33-4162-82f8-bfa33a4b15ed/nova-cell0-conductor-conductor/0.log" Nov 22 10:23:24 crc kubenswrapper[4846]: I1122 10:23:24.945691 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7b0d3ce0-49e4-4e73-b2e9-ce405a023987/nova-cell1-conductor-conductor/0.log" Nov 22 10:23:25 crc kubenswrapper[4846]: I1122 10:23:25.027198 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_faa4297f-4d7b-4942-958c-ccc0f3891f2a/nova-cell1-novncproxy-novncproxy/0.log" Nov 22 10:23:25 crc kubenswrapper[4846]: I1122 10:23:25.083487 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_732fe70d-07f5-455f-b20a-5a4d0c92c764/nova-api-api/0.log" Nov 22 10:23:25 crc kubenswrapper[4846]: I1122 10:23:25.199623 4846 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nl69g_e51d5d70-b3f1-41e3-b6c4-f3bf9b569417/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:25 crc kubenswrapper[4846]: I1122 10:23:25.317479 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea/nova-metadata-log/0.log" Nov 22 10:23:25 crc kubenswrapper[4846]: I1122 10:23:25.645223 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4e66c89-9999-4584-a149-2c18589a522a/mysql-bootstrap/0.log" Nov 22 10:23:25 crc kubenswrapper[4846]: I1122 10:23:25.672573 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_177c2c54-1d5d-409c-8592-141b25fc59cc/nova-scheduler-scheduler/0.log" Nov 22 10:23:25 crc kubenswrapper[4846]: I1122 10:23:25.884562 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4e66c89-9999-4584-a149-2c18589a522a/galera/0.log" Nov 22 10:23:25 crc kubenswrapper[4846]: I1122 10:23:25.886785 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4e66c89-9999-4584-a149-2c18589a522a/mysql-bootstrap/0.log" Nov 22 10:23:26 crc kubenswrapper[4846]: I1122 10:23:26.087130 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_93cad534-86a5-4420-951f-859efc86a70a/mysql-bootstrap/0.log" Nov 22 10:23:26 crc kubenswrapper[4846]: I1122 10:23:26.272486 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_93cad534-86a5-4420-951f-859efc86a70a/galera/0.log" Nov 22 10:23:26 crc kubenswrapper[4846]: I1122 10:23:26.293373 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_93cad534-86a5-4420-951f-859efc86a70a/mysql-bootstrap/0.log" Nov 22 10:23:26 crc kubenswrapper[4846]: I1122 10:23:26.462961 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0e37cf7b-6c4e-44c5-8193-38a0888efeee/openstackclient/0.log" Nov 22 10:23:26 crc kubenswrapper[4846]: I1122 10:23:26.576592 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-576fl_65c370a7-5d69-437a-98d2-810e97b9a5b7/ovn-controller/0.log" Nov 22 10:23:26 crc kubenswrapper[4846]: I1122 10:23:26.657324 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_79fac89e-72b8-4ee5-a2b4-4a56caf2a3ea/nova-metadata-metadata/0.log" Nov 22 10:23:26 crc kubenswrapper[4846]: I1122 10:23:26.820537 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bnq8b_80b3f55d-b10e-40f1-9d45-4ed801491f54/openstack-network-exporter/0.log" Nov 22 10:23:26 crc kubenswrapper[4846]: I1122 10:23:26.959923 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdxdm_9315fa04-bcf9-4013-be72-f29a5cf95f4e/ovsdb-server-init/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.069070 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdxdm_9315fa04-bcf9-4013-be72-f29a5cf95f4e/ovsdb-server-init/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.074471 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdxdm_9315fa04-bcf9-4013-be72-f29a5cf95f4e/ovs-vswitchd/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.107280 4846 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bdxdm_9315fa04-bcf9-4013-be72-f29a5cf95f4e/ovsdb-server/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.309217 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-rfbkn_d326c85b-6234-469b-b6f4-8a4d72b62dab/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.384305 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fa80bcbe-b4a6-4515-b366-9ba9b0d92440/openstack-network-exporter/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.417672 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fa80bcbe-b4a6-4515-b366-9ba9b0d92440/ovn-northd/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.516395 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a5c5e879-a8c6-4758-a577-00d371164c9d/openstack-network-exporter/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.600320 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a5c5e879-a8c6-4758-a577-00d371164c9d/ovsdbserver-nb/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.747540 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aff4ba43-41a2-420b-8f89-99c69c1f3cfc/openstack-network-exporter/0.log" Nov 22 10:23:27 crc kubenswrapper[4846]: I1122 10:23:27.770666 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aff4ba43-41a2-420b-8f89-99c69c1f3cfc/ovsdbserver-sb/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.037561 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-569956d6b4-jtk8r_2f8c4b78-83b6-4f98-a4e2-ef7f56043775/placement-api/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.229574 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-569956d6b4-jtk8r_2f8c4b78-83b6-4f98-a4e2-ef7f56043775/placement-log/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.244807 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_812351d5-d992-4243-94c9-3328217b37b9/setup-container/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.442116 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_812351d5-d992-4243-94c9-3328217b37b9/setup-container/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.457876 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_812351d5-d992-4243-94c9-3328217b37b9/rabbitmq/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.523867 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b44e9aa-f202-48be-bace-279f29824c1b/setup-container/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.625263 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.625330 4846 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.625382 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.626305 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e3447fd2376a9e62f224b7e26d25446dcd902e6769cfc416d5b85222bd3cdb68"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.626401 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://e3447fd2376a9e62f224b7e26d25446dcd902e6769cfc416d5b85222bd3cdb68" gracePeriod=600 Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.776134 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b44e9aa-f202-48be-bace-279f29824c1b/setup-container/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.785596 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-5x2rg_0364c9c7-ad57-4109-bdf0-9c888a609515/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.793474 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b44e9aa-f202-48be-bace-279f29824c1b/rabbitmq/0.log" Nov 22 10:23:28 crc kubenswrapper[4846]: I1122 10:23:28.985651 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-dmvg6_da832f40-8579-415e-82c8-3e66684eb241/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.020783 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cbfrn_2565b5ab-c381-4a01-bc51-98d00dc7ce25/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.273561 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gkqr4_6dfb2ed1-5b77-4f1e-8f7b-59bc4ea8a297/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.345210 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-t6fpn_cf943f33-8c4e-4195-aa85-c1f60841b9ab/ssh-known-hosts-edpm-deployment/0.log" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.556100 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-86d575f679-k6l72_52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2/proxy-server/0.log" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.575878 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" 
containerID="e3447fd2376a9e62f224b7e26d25446dcd902e6769cfc416d5b85222bd3cdb68" exitCode=0 Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.575919 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"e3447fd2376a9e62f224b7e26d25446dcd902e6769cfc416d5b85222bd3cdb68"} Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.575971 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerStarted","Data":"6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"} Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.575992 4846 scope.go:117] "RemoveContainer" containerID="fff79376dadc6262d85f76ce2a671cc10b339d01b7fb1d406fa3fe417ac30b88" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.629354 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-86d575f679-k6l72_52e14f86-9c5c-4f9b-8af9-2bcb9e356bb2/proxy-httpd/0.log" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.695336 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hgdvt_6f537097-bfac-4915-833f-ee9a52e7d8a5/swift-ring-rebalance/0.log" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.810599 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/account-auditor/0.log" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.842959 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/account-reaper/0.log" Nov 22 10:23:29 crc kubenswrapper[4846]: I1122 10:23:29.910966 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/account-replicator/0.log" Nov 22 10:23:30 crc kubenswrapper[4846]: I1122 10:23:30.016652 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/account-server/0.log" Nov 22 10:23:30 crc kubenswrapper[4846]: I1122 10:23:30.023060 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/container-auditor/0.log" Nov 22 10:23:30 crc kubenswrapper[4846]: I1122 10:23:30.070865 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/container-replicator/0.log" Nov 22 10:23:30 crc kubenswrapper[4846]: I1122 10:23:30.120359 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/container-server/0.log" Nov 22 10:23:30 crc kubenswrapper[4846]: I1122 10:23:30.788955 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/container-updater/0.log" Nov 22 10:23:30 crc kubenswrapper[4846]: I1122 10:23:30.818285 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-auditor/0.log" Nov 22 10:23:30 crc kubenswrapper[4846]: I1122 10:23:30.865683 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-replicator/0.log" Nov 22 10:23:30 crc 
kubenswrapper[4846]: I1122 10:23:30.867941 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-expirer/0.log" Nov 22 10:23:30 crc kubenswrapper[4846]: I1122 10:23:30.999348 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-updater/0.log" Nov 22 10:23:31 crc kubenswrapper[4846]: I1122 10:23:31.061823 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/object-server/0.log" Nov 22 10:23:31 crc kubenswrapper[4846]: I1122 10:23:31.096505 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/rsync/0.log" Nov 22 10:23:31 crc kubenswrapper[4846]: I1122 10:23:31.111462 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_575c6d2b-ae18-48ec-a314-211ccd078d87/swift-recon-cron/0.log" Nov 22 10:23:31 crc kubenswrapper[4846]: I1122 10:23:31.302339 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wd8cm_7fa86a5b-2dbc-4e12-bf49-ea58d02854b0/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:31 crc kubenswrapper[4846]: I1122 10:23:31.361600 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0746377b-0ff5-4289-b4b6-1e9c3a166533/tempest-tests-tempest-tests-runner/0.log" Nov 22 10:23:31 crc kubenswrapper[4846]: I1122 10:23:31.559295 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_423242c9-5d5f-4a1d-83db-13989d8d78b1/test-operator-logs-container/0.log" Nov 22 10:23:31 crc kubenswrapper[4846]: I1122 10:23:31.648957 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x2rtz_cfeec82e-6d58-4819-8715-7d0febbe480c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 10:23:41 crc kubenswrapper[4846]: I1122 10:23:41.830308 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4d174fc1-bcf2-4812-9766-875d3ca3efe5/memcached/0.log" Nov 22 10:23:57 crc kubenswrapper[4846]: I1122 10:23:57.293729 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/util/0.log" Nov 22 10:23:57 crc kubenswrapper[4846]: I1122 10:23:57.513489 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/pull/0.log" Nov 22 10:23:57 crc kubenswrapper[4846]: I1122 10:23:57.524942 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/util/0.log" Nov 22 10:23:57 crc kubenswrapper[4846]: I1122 10:23:57.542293 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/pull/0.log" Nov 22 10:23:57 crc kubenswrapper[4846]: I1122 10:23:57.736008 4846 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/extract/0.log" Nov 22 10:23:57 crc kubenswrapper[4846]: I1122 10:23:57.769207 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/util/0.log" Nov 22 10:23:57 crc kubenswrapper[4846]: I1122 10:23:57.774404 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_473a8e94cedda694fa3055e2c9a38fb258b433bd86bc998159f2bf3f8b4cc4x_be2f7b8e-cdfc-4405-a4a7-9d835a12da05/pull/0.log" Nov 22 10:23:57 crc kubenswrapper[4846]: I1122 10:23:57.923879 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-qcqgd_01860b24-58b0-422d-a390-fc783a2f4990/kube-rbac-proxy/0.log" Nov 22 10:23:57 crc kubenswrapper[4846]: I1122 10:23:57.991853 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-qcqgd_01860b24-58b0-422d-a390-fc783a2f4990/manager/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.036245 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-r4xp2_371bad3e-fcc3-42c5-a563-fc7d6aa5f275/kube-rbac-proxy/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.206666 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-r4xp2_371bad3e-fcc3-42c5-a563-fc7d6aa5f275/manager/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.300668 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-rb6cp_f1008fc2-d21a-4775-8505-12116c0a1d94/kube-rbac-proxy/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.303130 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-rb6cp_f1008fc2-d21a-4775-8505-12116c0a1d94/manager/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.409160 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-fwlcn_4dac3679-62ae-408f-b3ba-1809daaceb47/kube-rbac-proxy/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.538155 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-fwlcn_4dac3679-62ae-408f-b3ba-1809daaceb47/manager/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.597985 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-gqktx_539f5169-bf3b-4c3c-828a-8490d4d758d8/kube-rbac-proxy/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.607575 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-gqktx_539f5169-bf3b-4c3c-828a-8490d4d758d8/manager/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.719091 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-vt8xd_c4abfa7d-5927-41f1-af53-bc1ea6878bc1/kube-rbac-proxy/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.786158 
4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-vt8xd_c4abfa7d-5927-41f1-af53-bc1ea6878bc1/manager/0.log" Nov 22 10:23:58 crc kubenswrapper[4846]: I1122 10:23:58.867752 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6ccc968f7b-dxpcq_f7cb339f-9ebe-441d-ae17-43ad2ce13201/kube-rbac-proxy/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.033064 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-9wwm2_22e226e0-ebce-4d63-9379-109fe06b88da/kube-rbac-proxy/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.055354 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6ccc968f7b-dxpcq_f7cb339f-9ebe-441d-ae17-43ad2ce13201/manager/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.076318 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-9wwm2_22e226e0-ebce-4d63-9379-109fe06b88da/manager/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.227798 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-ltkkr_ab7af809-056a-45c1-bdd0-5e4a8bea02ef/kube-rbac-proxy/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.336979 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-ltkkr_ab7af809-056a-45c1-bdd0-5e4a8bea02ef/manager/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.375681 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-c8ssm_e84b7960-5cd2-4557-9b3c-a98ed4784006/kube-rbac-proxy/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.439911 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-c8ssm_e84b7960-5cd2-4557-9b3c-a98ed4784006/manager/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.522499 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-jbjc4_f4a50a36-b951-4342-b092-c94bea3d860e/kube-rbac-proxy/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.547070 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-jbjc4_f4a50a36-b951-4342-b092-c94bea3d860e/manager/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.672620 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-jwx4v_07d22ef0-2712-4daf-a620-081fee41f68f/kube-rbac-proxy/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.734727 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-jwx4v_07d22ef0-2712-4daf-a620-081fee41f68f/manager/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.797800 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-t8nfb_766c68ab-9022-4efd-84a3-af4aedf7d7b2/kube-rbac-proxy/0.log" Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 
10:23:59.928406 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-t8nfb_766c68ab-9022-4efd-84a3-af4aedf7d7b2/manager/0.log"
Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.944491 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-9jbkf_facf2ae5-028f-4413-a2d6-e503489ae5f3/kube-rbac-proxy/0.log"
Nov 22 10:23:59 crc kubenswrapper[4846]: I1122 10:23:59.995172 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-9jbkf_facf2ae5-028f-4413-a2d6-e503489ae5f3/manager/0.log"
Nov 22 10:24:00 crc kubenswrapper[4846]: I1122 10:24:00.103691 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-thll2_eaacbd1d-48b7-40a3-b7e4-48fc074e37fb/kube-rbac-proxy/0.log"
Nov 22 10:24:00 crc kubenswrapper[4846]: I1122 10:24:00.119776 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-thll2_eaacbd1d-48b7-40a3-b7e4-48fc074e37fb/manager/0.log"
Nov 22 10:24:00 crc kubenswrapper[4846]: I1122 10:24:00.933615 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67485f68cb-z25cl_7fa6485e-01f3-43e7-ac4e-f639cd3983d5/kube-rbac-proxy/0.log"
Nov 22 10:24:00 crc kubenswrapper[4846]: I1122 10:24:00.959006 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-559dfbff4-8cpxr_d1b0081a-5f33-484d-8250-9ec2ab872b64/kube-rbac-proxy/0.log"
Nov 22 10:24:01 crc kubenswrapper[4846]: I1122 10:24:01.339197 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-bh6tp_562d6113-df0b-4993-b7b8-1cace4f13fe0/registry-server/0.log"
Nov 22 10:24:01 crc kubenswrapper[4846]: I1122 10:24:01.488554 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-559dfbff4-8cpxr_d1b0081a-5f33-484d-8250-9ec2ab872b64/operator/0.log"
Nov 22 10:24:01 crc kubenswrapper[4846]: I1122 10:24:01.523958 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-458hx_5454b9eb-3a18-47d6-ba8e-1b7230659b26/kube-rbac-proxy/0.log"
Nov 22 10:24:01 crc kubenswrapper[4846]: I1122 10:24:01.650804 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-458hx_5454b9eb-3a18-47d6-ba8e-1b7230659b26/manager/0.log"
Nov 22 10:24:01 crc kubenswrapper[4846]: I1122 10:24:01.655731 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-7lg8k_671de1b8-d3f3-4a1e-8572-e2840bf58e17/kube-rbac-proxy/0.log"
Nov 22 10:24:01 crc kubenswrapper[4846]: I1122 10:24:01.768571 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-7lg8k_671de1b8-d3f3-4a1e-8572-e2840bf58e17/manager/0.log"
Nov 22 10:24:01 crc kubenswrapper[4846]: I1122 10:24:01.884324 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67485f68cb-z25cl_7fa6485e-01f3-43e7-ac4e-f639cd3983d5/manager/0.log"
Nov 22 10:24:01 crc kubenswrapper[4846]: I1122 10:24:01.910944 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-7v42k_f6c75a18-6338-4da3-8b61-a973a8589e66/operator/0.log"
Nov 22 10:24:01 crc kubenswrapper[4846]: I1122 10:24:01.977807 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-cnhgp_113bd687-6dff-4159-b034-3a27a0683260/kube-rbac-proxy/0.log"
Nov 22 10:24:02 crc kubenswrapper[4846]: I1122 10:24:02.908819 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-cnhgp_113bd687-6dff-4159-b034-3a27a0683260/manager/0.log"
Nov 22 10:24:02 crc kubenswrapper[4846]: I1122 10:24:02.914546 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-qbwl5_63f03060-74f5-437f-bb06-a2626c791a06/manager/0.log"
Nov 22 10:24:02 crc kubenswrapper[4846]: I1122 10:24:02.917317 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-qbwl5_63f03060-74f5-437f-bb06-a2626c791a06/kube-rbac-proxy/0.log"
Nov 22 10:24:03 crc kubenswrapper[4846]: I1122 10:24:03.089004 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-7f7qj_895d9e5d-08de-4611-a844-c2db9e8e1839/kube-rbac-proxy/0.log"
Nov 22 10:24:03 crc kubenswrapper[4846]: I1122 10:24:03.116248 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-644f7_8cec188d-f264-4a62-96f1-93e309820fe6/kube-rbac-proxy/0.log"
Nov 22 10:24:03 crc kubenswrapper[4846]: I1122 10:24:03.131739 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-644f7_8cec188d-f264-4a62-96f1-93e309820fe6/manager/0.log"
Nov 22 10:24:03 crc kubenswrapper[4846]: I1122 10:24:03.155575 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-7f7qj_895d9e5d-08de-4611-a844-c2db9e8e1839/manager/0.log"
Nov 22 10:24:19 crc kubenswrapper[4846]: I1122 10:24:19.956809 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-b95xr_ee403130-f909-4216-a9ff-8a4cb41d4017/control-plane-machine-set-operator/0.log"
Nov 22 10:24:20 crc kubenswrapper[4846]: I1122 10:24:20.127007 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lz8p8_01e5ec75-28e3-4baa-8501-cbe8c740ec3f/machine-api-operator/0.log"
Nov 22 10:24:20 crc kubenswrapper[4846]: I1122 10:24:20.150284 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lz8p8_01e5ec75-28e3-4baa-8501-cbe8c740ec3f/kube-rbac-proxy/0.log"
Nov 22 10:24:35 crc kubenswrapper[4846]: I1122 10:24:35.210365 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nb9wl_1d9cea2b-9f89-437e-a0f3-875b123a47d3/cert-manager-controller/0.log"
Nov 22 10:24:35 crc kubenswrapper[4846]: I1122 10:24:35.836312 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-d4jkh_c55358d6-9876-4e6a-9b06-08db6080a803/cert-manager-cainjector/0.log"
Nov 22 10:24:35 crc kubenswrapper[4846]: I1122 10:24:35.861151 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-mppx8_27cc0714-ab99-4ddc-9e9c-66f24bba9fac/cert-manager-webhook/0.log"
Nov 22 10:24:48 crc kubenswrapper[4846]: I1122 10:24:48.563952 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-zplwx_ea454b74-e77b-4f90-8311-563ab0e66191/nmstate-console-plugin/0.log"
Nov 22 10:24:48 crc kubenswrapper[4846]: I1122 10:24:48.740305 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lpsrm_dd1e7111-d57d-44c4-bcdb-7045dc626f01/nmstate-handler/0.log"
Nov 22 10:24:48 crc kubenswrapper[4846]: I1122 10:24:48.772216 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-bkbgk_9173cda0-1bab-4e52-96e3-4e3c564b846f/kube-rbac-proxy/0.log"
Nov 22 10:24:48 crc kubenswrapper[4846]: I1122 10:24:48.814583 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-bkbgk_9173cda0-1bab-4e52-96e3-4e3c564b846f/nmstate-metrics/0.log"
Nov 22 10:24:48 crc kubenswrapper[4846]: I1122 10:24:48.949493 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-629pv_1fecb21a-594d-4e4f-a063-37cbf0e0d5ea/nmstate-operator/0.log"
Nov 22 10:24:48 crc kubenswrapper[4846]: I1122 10:24:48.990377 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-55lsz_c6f850af-f692-4fa4-b289-1fd426f79090/nmstate-webhook/0.log"
Nov 22 10:25:02 crc kubenswrapper[4846]: I1122 10:25:02.717804 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-z5dqq_425496ff-38a1-4d67-b702-9bb864465158/kube-rbac-proxy/0.log"
Nov 22 10:25:02 crc kubenswrapper[4846]: I1122 10:25:02.838687 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-z5dqq_425496ff-38a1-4d67-b702-9bb864465158/controller/0.log"
Nov 22 10:25:02 crc kubenswrapper[4846]: I1122 10:25:02.925343 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-frr-files/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.100331 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-frr-files/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.103936 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-reloader/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.109662 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-metrics/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.170450 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-reloader/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.283665 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-reloader/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.285112 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-frr-files/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.305872 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-metrics/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.376257 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-metrics/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.538616 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-reloader/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.541544 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-frr-files/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.550309 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/cp-metrics/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.609135 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/controller/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.760369 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/frr-metrics/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.760897 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/kube-rbac-proxy/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.849354 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/kube-rbac-proxy-frr/0.log"
Nov 22 10:25:03 crc kubenswrapper[4846]: I1122 10:25:03.969252 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/reloader/0.log"
Nov 22 10:25:04 crc kubenswrapper[4846]: I1122 10:25:04.056865 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-g4nfr_ae71f435-af46-44ac-afdb-57dea9cd1925/frr-k8s-webhook-server/0.log"
Nov 22 10:25:04 crc kubenswrapper[4846]: I1122 10:25:04.584105 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6b9465489d-lwlfq_a9a28c92-48ff-4026-819b-70068881c12b/manager/0.log"
Nov 22 10:25:04 crc kubenswrapper[4846]: I1122 10:25:04.808540 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57ff77b6c8-sd485_2c8911b3-3f77-4666-822f-e40c1100c67f/webhook-server/0.log"
Nov 22 10:25:04 crc kubenswrapper[4846]: I1122 10:25:04.903399 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jtgbm_b4e18041-980a-4cbb-ba17-98b3f6032c57/kube-rbac-proxy/0.log"
Nov 22 10:25:05 crc kubenswrapper[4846]: I1122 10:25:05.076161 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4btm7_93d95094-1954-4055-b057-9c94763afc6f/frr/0.log"
Nov 22 10:25:05 crc kubenswrapper[4846]: I1122 10:25:05.350661 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jtgbm_b4e18041-980a-4cbb-ba17-98b3f6032c57/speaker/0.log"
Nov 22 10:25:17 crc kubenswrapper[4846]: I1122 10:25:17.930331 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/util/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.063058 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/util/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.096966 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/pull/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.122415 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/pull/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.325363 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/util/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.394902 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/extract/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.395583 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772ecsjvx_f112ec7f-7ff7-4205-a2c9-331d34530c5a/pull/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.504324 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-utilities/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.671765 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-utilities/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.708939 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-content/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.717769 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-content/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.874596 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-utilities/0.log"
Nov 22 10:25:18 crc kubenswrapper[4846]: I1122 10:25:18.919654 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/extract-content/0.log"
Nov 22 10:25:19 crc kubenswrapper[4846]: I1122 10:25:19.081973 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-utilities/0.log"
Nov 22 10:25:19 crc kubenswrapper[4846]: I1122 10:25:19.322317 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-utilities/0.log"
Nov 22 10:25:19 crc kubenswrapper[4846]: I1122 10:25:19.369684 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-content/0.log"
Nov 22 10:25:19 crc kubenswrapper[4846]: I1122 10:25:19.381749 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-content/0.log"
Nov 22 10:25:19 crc kubenswrapper[4846]: I1122 10:25:19.487480 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-utilities/0.log"
Nov 22 10:25:19 crc kubenswrapper[4846]: I1122 10:25:19.528996 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xw7p_9edf4077-fb2b-42e9-8fb4-089d14519da9/registry-server/0.log"
Nov 22 10:25:19 crc kubenswrapper[4846]: I1122 10:25:19.572176 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/extract-content/0.log"
Nov 22 10:25:19 crc kubenswrapper[4846]: I1122 10:25:19.749898 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/util/0.log"
Nov 22 10:25:19 crc kubenswrapper[4846]: I1122 10:25:19.994281 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/pull/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.042814 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/util/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.056206 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/pull/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.172205 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8nbgq_38303d50-92e8-4134-9869-52964f9d76f0/registry-server/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.270458 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/pull/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.299957 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/util/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.319778 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lt54w_3743cbee-9a49-40c8-bdae-7913ec94b4d1/extract/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.499526 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2lfzs_b2d91bbe-e29e-4a12-a7a8-92c26c4a977b/marketplace-operator/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.525308 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-utilities/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.678246 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-utilities/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.712409 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-content/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.733572 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-content/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.861905 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-utilities/0.log"
Nov 22 10:25:20 crc kubenswrapper[4846]: I1122 10:25:20.867872 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/extract-content/0.log"
Nov 22 10:25:21 crc kubenswrapper[4846]: I1122 10:25:21.028552 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-utilities/0.log"
Nov 22 10:25:21 crc kubenswrapper[4846]: I1122 10:25:21.055340 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bxljw_17186de5-6faa-416f-a138-e32ed89d2ad5/registry-server/0.log"
Nov 22 10:25:21 crc kubenswrapper[4846]: I1122 10:25:21.216875 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-utilities/0.log"
Nov 22 10:25:21 crc kubenswrapper[4846]: I1122 10:25:21.222795 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-content/0.log"
Nov 22 10:25:21 crc kubenswrapper[4846]: I1122 10:25:21.230980 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-content/0.log"
Nov 22 10:25:21 crc kubenswrapper[4846]: I1122 10:25:21.407881 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-content/0.log"
Nov 22 10:25:21 crc kubenswrapper[4846]: I1122 10:25:21.412330 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/extract-utilities/0.log"
Nov 22 10:25:21 crc kubenswrapper[4846]: I1122 10:25:21.933658 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z55r4_256ed2d8-7444-4001-ae7c-2592adcb4e72/registry-server/0.log"
Nov 22 10:25:28 crc kubenswrapper[4846]: I1122 10:25:28.625789 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:25:28 crc kubenswrapper[4846]: I1122 10:25:28.626601 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:25:47 crc kubenswrapper[4846]: E1122 10:25:47.397913 4846 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.107:43018->38.129.56.107:32975: write tcp 38.129.56.107:43018->38.129.56.107:32975: write: broken pipe
Nov 22 10:25:58 crc kubenswrapper[4846]: I1122 10:25:58.625135 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:25:58 crc kubenswrapper[4846]: I1122 10:25:58.625679 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:26:28 crc kubenswrapper[4846]: I1122 10:26:28.625223 4846 patch_prober.go:28] interesting pod/machine-config-daemon-c59mw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:26:28 crc kubenswrapper[4846]: I1122 10:26:28.625811 4846 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:26:28 crc kubenswrapper[4846]: I1122 10:26:28.625848 4846 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c59mw"
Nov 22 10:26:28 crc kubenswrapper[4846]: I1122 10:26:28.626550 4846 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"} pod="openshift-machine-config-operator/machine-config-daemon-c59mw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 10:26:28 crc kubenswrapper[4846]: I1122 10:26:28.626602 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" containerName="machine-config-daemon" containerID="cri-o://6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af" gracePeriod=600
Nov 22 10:26:28 crc kubenswrapper[4846]: E1122 10:26:28.752406 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:26:29 crc kubenswrapper[4846]: I1122 10:26:29.257950 4846 generic.go:334] "Generic (PLEG): container finished" podID="86a01cc5-5438-4978-8919-2d24f665922a" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af" exitCode=0
Nov 22 10:26:29 crc kubenswrapper[4846]: I1122 10:26:29.258286 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" event={"ID":"86a01cc5-5438-4978-8919-2d24f665922a","Type":"ContainerDied","Data":"6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"}
Nov 22 10:26:29 crc kubenswrapper[4846]: I1122 10:26:29.258320 4846 scope.go:117] "RemoveContainer" containerID="e3447fd2376a9e62f224b7e26d25446dcd902e6769cfc416d5b85222bd3cdb68"
Nov 22 10:26:29 crc kubenswrapper[4846]: I1122 10:26:29.259362 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:26:29 crc kubenswrapper[4846]: E1122 10:26:29.260020 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:26:42 crc kubenswrapper[4846]: I1122 10:26:42.036371 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:26:42 crc kubenswrapper[4846]: E1122 10:26:42.037261 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:26:57 crc kubenswrapper[4846]: I1122 10:26:57.035981 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:26:57 crc kubenswrapper[4846]: E1122 10:26:57.037253 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:27:01 crc kubenswrapper[4846]: I1122 10:27:01.642645 4846 generic.go:334] "Generic (PLEG): container finished" podID="73e5c479-31c4-446d-89ff-dae0c3bc674c" containerID="a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a" exitCode=0
Nov 22 10:27:01 crc kubenswrapper[4846]: I1122 10:27:01.642747 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5tk6n/must-gather-2g9sk" event={"ID":"73e5c479-31c4-446d-89ff-dae0c3bc674c","Type":"ContainerDied","Data":"a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a"}
Nov 22 10:27:01 crc kubenswrapper[4846]: I1122 10:27:01.644888 4846 scope.go:117] "RemoveContainer" containerID="a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a"
Nov 22 10:27:02 crc kubenswrapper[4846]: I1122 10:27:02.014357 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5tk6n_must-gather-2g9sk_73e5c479-31c4-446d-89ff-dae0c3bc674c/gather/0.log"
Nov 22 10:27:10 crc kubenswrapper[4846]: I1122 10:27:10.035838 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:27:10 crc kubenswrapper[4846]: E1122 10:27:10.038031 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:27:11 crc kubenswrapper[4846]: I1122 10:27:11.818529 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5tk6n/must-gather-2g9sk"]
Nov 22 10:27:11 crc kubenswrapper[4846]: I1122 10:27:11.819023 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5tk6n/must-gather-2g9sk" podUID="73e5c479-31c4-446d-89ff-dae0c3bc674c" containerName="copy" containerID="cri-o://43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f" gracePeriod=2
Nov 22 10:27:11 crc kubenswrapper[4846]: I1122 10:27:11.827925 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5tk6n/must-gather-2g9sk"]
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.413896 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5tk6n_must-gather-2g9sk_73e5c479-31c4-446d-89ff-dae0c3bc674c/copy/0.log"
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.414686 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tk6n/must-gather-2g9sk"
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.452185 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73e5c479-31c4-446d-89ff-dae0c3bc674c-must-gather-output\") pod \"73e5c479-31c4-446d-89ff-dae0c3bc674c\" (UID: \"73e5c479-31c4-446d-89ff-dae0c3bc674c\") "
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.452256 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv7cr\" (UniqueName: \"kubernetes.io/projected/73e5c479-31c4-446d-89ff-dae0c3bc674c-kube-api-access-nv7cr\") pod \"73e5c479-31c4-446d-89ff-dae0c3bc674c\" (UID: \"73e5c479-31c4-446d-89ff-dae0c3bc674c\") "
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.461284 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e5c479-31c4-446d-89ff-dae0c3bc674c-kube-api-access-nv7cr" (OuterVolumeSpecName: "kube-api-access-nv7cr") pod "73e5c479-31c4-446d-89ff-dae0c3bc674c" (UID: "73e5c479-31c4-446d-89ff-dae0c3bc674c"). InnerVolumeSpecName "kube-api-access-nv7cr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.556387 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv7cr\" (UniqueName: \"kubernetes.io/projected/73e5c479-31c4-446d-89ff-dae0c3bc674c-kube-api-access-nv7cr\") on node \"crc\" DevicePath \"\""
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.646688 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73e5c479-31c4-446d-89ff-dae0c3bc674c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "73e5c479-31c4-446d-89ff-dae0c3bc674c" (UID: "73e5c479-31c4-446d-89ff-dae0c3bc674c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.658074 4846 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73e5c479-31c4-446d-89ff-dae0c3bc674c-must-gather-output\") on node \"crc\" DevicePath \"\""
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.749225 4846 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5tk6n_must-gather-2g9sk_73e5c479-31c4-446d-89ff-dae0c3bc674c/copy/0.log"
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.750106 4846 generic.go:334] "Generic (PLEG): container finished" podID="73e5c479-31c4-446d-89ff-dae0c3bc674c" containerID="43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f" exitCode=143
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.750167 4846 scope.go:117] "RemoveContainer" containerID="43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f"
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.750274 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5tk6n/must-gather-2g9sk"
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.768738 4846 scope.go:117] "RemoveContainer" containerID="a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a"
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.854725 4846 scope.go:117] "RemoveContainer" containerID="43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f"
Nov 22 10:27:12 crc kubenswrapper[4846]: E1122 10:27:12.855183 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f\": container with ID starting with 43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f not found: ID does not exist" containerID="43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f"
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.855281 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f"} err="failed to get container status \"43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f\": rpc error: code = NotFound desc = could not find container \"43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f\": container with ID starting with 43e21f9c469bc8bf5547fbcd8cba77b1b2823893906709f57228bed5da0f030f not found: ID does not exist"
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.855345 4846 scope.go:117] "RemoveContainer" containerID="a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a"
Nov 22 10:27:12 crc kubenswrapper[4846]: E1122 10:27:12.856682 4846 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a\": container with ID starting with a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a not found: ID does not exist" containerID="a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a"
Nov 22 10:27:12 crc kubenswrapper[4846]: I1122 10:27:12.856739 4846 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a"} err="failed to get container status \"a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a\": rpc error: code = NotFound desc = could not find container \"a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a\": container with ID starting with a960a0012f32cce8911b0341a9224e837bab37cc217bd203fd3d4fe23222a78a not found: ID does not exist"
Nov 22 10:27:14 crc kubenswrapper[4846]: I1122 10:27:14.050002 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73e5c479-31c4-446d-89ff-dae0c3bc674c" path="/var/lib/kubelet/pods/73e5c479-31c4-446d-89ff-dae0c3bc674c/volumes"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.330427 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bsdcr"]
Nov 22 10:27:21 crc kubenswrapper[4846]: E1122 10:27:21.331721 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e5c479-31c4-446d-89ff-dae0c3bc674c" containerName="gather"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.331741 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e5c479-31c4-446d-89ff-dae0c3bc674c" containerName="gather"
Nov 22 10:27:21 crc kubenswrapper[4846]: E1122 10:27:21.331756 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031f5331-896b-45e3-b973-b03ab16eb967" containerName="container-00"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.331764 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="031f5331-896b-45e3-b973-b03ab16eb967" containerName="container-00"
Nov 22 10:27:21 crc kubenswrapper[4846]: E1122 10:27:21.331784 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e5c479-31c4-446d-89ff-dae0c3bc674c" containerName="copy"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.331791 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e5c479-31c4-446d-89ff-dae0c3bc674c" containerName="copy"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.332144 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e5c479-31c4-446d-89ff-dae0c3bc674c" containerName="gather"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.332161 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="031f5331-896b-45e3-b973-b03ab16eb967" containerName="container-00"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.332181 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e5c479-31c4-446d-89ff-dae0c3bc674c" containerName="copy"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.333763 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.355255 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsdcr"]
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.424868 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-catalog-content\") pod \"certified-operators-bsdcr\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") " pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.424958 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-utilities\") pod \"certified-operators-bsdcr\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") " pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.425088 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4jfl\" (UniqueName: \"kubernetes.io/projected/23a64585-1bea-491c-bfd9-67aa22eef780-kube-api-access-s4jfl\") pod \"certified-operators-bsdcr\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") " pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.527103 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4jfl\" (UniqueName: \"kubernetes.io/projected/23a64585-1bea-491c-bfd9-67aa22eef780-kube-api-access-s4jfl\") pod \"certified-operators-bsdcr\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") " pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.527289 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-catalog-content\") pod \"certified-operators-bsdcr\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") " pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.527324 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-utilities\") pod \"certified-operators-bsdcr\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") " pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.527862 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-utilities\") pod \"certified-operators-bsdcr\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") " pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.528508 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-catalog-content\") pod \"certified-operators-bsdcr\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") " pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.556012 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4jfl\" (UniqueName: \"kubernetes.io/projected/23a64585-1bea-491c-bfd9-67aa22eef780-kube-api-access-s4jfl\") pod \"certified-operators-bsdcr\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") " pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:21 crc kubenswrapper[4846]: I1122 10:27:21.675427 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:22 crc kubenswrapper[4846]: I1122 10:27:22.239328 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bsdcr"]
Nov 22 10:27:22 crc kubenswrapper[4846]: I1122 10:27:22.854236 4846 generic.go:334] "Generic (PLEG): container finished" podID="23a64585-1bea-491c-bfd9-67aa22eef780" containerID="a0bec480b6c9423065e095db28bc6702d47af2fab71c9b5c1652d0fb26b8941d" exitCode=0
Nov 22 10:27:22 crc kubenswrapper[4846]: I1122 10:27:22.854311 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsdcr" event={"ID":"23a64585-1bea-491c-bfd9-67aa22eef780","Type":"ContainerDied","Data":"a0bec480b6c9423065e095db28bc6702d47af2fab71c9b5c1652d0fb26b8941d"}
Nov 22 10:27:22 crc kubenswrapper[4846]: I1122 10:27:22.854634 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsdcr" event={"ID":"23a64585-1bea-491c-bfd9-67aa22eef780","Type":"ContainerStarted","Data":"54d7ef7302029d91a8413086b62bf63b26aa754aa17e0759fc77fe2dfe366100"}
Nov 22 10:27:22 crc kubenswrapper[4846]: I1122 10:27:22.856275 4846 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 10:27:23 crc kubenswrapper[4846]: I1122 10:27:23.872567 4846 generic.go:334] "Generic (PLEG): container finished" podID="23a64585-1bea-491c-bfd9-67aa22eef780" containerID="066ef8cc150415072a0e5d0927dc28be542b9090fbf1f9ff35b9c6dc225c6722" exitCode=0
Nov 22 10:27:23 crc kubenswrapper[4846]: I1122 10:27:23.872630 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsdcr" event={"ID":"23a64585-1bea-491c-bfd9-67aa22eef780","Type":"ContainerDied","Data":"066ef8cc150415072a0e5d0927dc28be542b9090fbf1f9ff35b9c6dc225c6722"}
Nov 22 10:27:24 crc kubenswrapper[4846]: I1122 10:27:24.886775 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsdcr" event={"ID":"23a64585-1bea-491c-bfd9-67aa22eef780","Type":"ContainerStarted","Data":"6be8692660d0b3e452eacfa25a2a24bf1b99dd7c2d1785f3a0b57ea750c50d09"}
Nov 22 10:27:24 crc kubenswrapper[4846]: I1122 10:27:24.914944 4846 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bsdcr" podStartSLOduration=2.41565009 podStartE2EDuration="3.91492605s" podCreationTimestamp="2025-11-22 10:27:21 +0000 UTC" firstStartedPulling="2025-11-22 10:27:22.855836661 +0000 UTC m=+4417.791526340" lastFinishedPulling="2025-11-22 10:27:24.355112651 +0000 UTC m=+4419.290802300" observedRunningTime="2025-11-22 10:27:24.914129408 +0000 UTC m=+4419.849819087" watchObservedRunningTime="2025-11-22 10:27:24.91492605 +0000 UTC m=+4419.850615719"
Nov 22 10:27:25 crc kubenswrapper[4846]: I1122 10:27:25.035334 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:27:25 crc kubenswrapper[4846]: E1122 10:27:25.035690 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:27:31 crc kubenswrapper[4846]: I1122 10:27:31.675785 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:31 crc kubenswrapper[4846]: I1122 10:27:31.677177 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:31 crc kubenswrapper[4846]: I1122 10:27:31.740805 4846 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:32 crc kubenswrapper[4846]: I1122 10:27:32.101544 4846 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:32 crc kubenswrapper[4846]: I1122 10:27:32.147692 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bsdcr"]
Nov 22 10:27:33 crc kubenswrapper[4846]: I1122 10:27:33.983706 4846 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bsdcr" podUID="23a64585-1bea-491c-bfd9-67aa22eef780" containerName="registry-server" containerID="cri-o://6be8692660d0b3e452eacfa25a2a24bf1b99dd7c2d1785f3a0b57ea750c50d09" gracePeriod=2
Nov 22 10:27:34 crc kubenswrapper[4846]: I1122 10:27:34.995192 4846 generic.go:334] "Generic (PLEG): container finished" podID="23a64585-1bea-491c-bfd9-67aa22eef780" containerID="6be8692660d0b3e452eacfa25a2a24bf1b99dd7c2d1785f3a0b57ea750c50d09" exitCode=0
Nov 22 10:27:34 crc kubenswrapper[4846]: I1122 10:27:34.995464 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsdcr" event={"ID":"23a64585-1bea-491c-bfd9-67aa22eef780","Type":"ContainerDied","Data":"6be8692660d0b3e452eacfa25a2a24bf1b99dd7c2d1785f3a0b57ea750c50d09"}
Nov 22 10:27:34 crc kubenswrapper[4846]: I1122 10:27:34.995762 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bsdcr" event={"ID":"23a64585-1bea-491c-bfd9-67aa22eef780","Type":"ContainerDied","Data":"54d7ef7302029d91a8413086b62bf63b26aa754aa17e0759fc77fe2dfe366100"}
Nov 22 10:27:34 crc kubenswrapper[4846]: I1122 10:27:34.995781 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54d7ef7302029d91a8413086b62bf63b26aa754aa17e0759fc77fe2dfe366100"
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.014110 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.116478 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4jfl\" (UniqueName: \"kubernetes.io/projected/23a64585-1bea-491c-bfd9-67aa22eef780-kube-api-access-s4jfl\") pod \"23a64585-1bea-491c-bfd9-67aa22eef780\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") "
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.142506 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a64585-1bea-491c-bfd9-67aa22eef780-kube-api-access-s4jfl" (OuterVolumeSpecName: "kube-api-access-s4jfl") pod "23a64585-1bea-491c-bfd9-67aa22eef780" (UID: "23a64585-1bea-491c-bfd9-67aa22eef780"). InnerVolumeSpecName "kube-api-access-s4jfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.220134 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-utilities\") pod \"23a64585-1bea-491c-bfd9-67aa22eef780\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") "
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.220357 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-catalog-content\") pod \"23a64585-1bea-491c-bfd9-67aa22eef780\" (UID: \"23a64585-1bea-491c-bfd9-67aa22eef780\") "
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.221257 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4jfl\" (UniqueName: \"kubernetes.io/projected/23a64585-1bea-491c-bfd9-67aa22eef780-kube-api-access-s4jfl\") on node \"crc\" DevicePath \"\""
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.221549 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-utilities" (OuterVolumeSpecName: "utilities") pod "23a64585-1bea-491c-bfd9-67aa22eef780" (UID: "23a64585-1bea-491c-bfd9-67aa22eef780"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.308625 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23a64585-1bea-491c-bfd9-67aa22eef780" (UID: "23a64585-1bea-491c-bfd9-67aa22eef780"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.323565 4846 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 10:27:35 crc kubenswrapper[4846]: I1122 10:27:35.323609 4846 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23a64585-1bea-491c-bfd9-67aa22eef780-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 10:27:36 crc kubenswrapper[4846]: I1122 10:27:36.005538 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bsdcr"
Nov 22 10:27:36 crc kubenswrapper[4846]: I1122 10:27:36.074217 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bsdcr"]
Nov 22 10:27:36 crc kubenswrapper[4846]: I1122 10:27:36.084092 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bsdcr"]
Nov 22 10:27:38 crc kubenswrapper[4846]: I1122 10:27:38.036367 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:27:38 crc kubenswrapper[4846]: E1122 10:27:38.038327 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:27:38 crc kubenswrapper[4846]: I1122 10:27:38.057929 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a64585-1bea-491c-bfd9-67aa22eef780" path="/var/lib/kubelet/pods/23a64585-1bea-491c-bfd9-67aa22eef780/volumes"
Nov 22 10:27:51 crc kubenswrapper[4846]: I1122 10:27:51.035394 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:27:51 crc kubenswrapper[4846]: E1122 10:27:51.036772 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:28:06 crc kubenswrapper[4846]: I1122 10:28:06.051240 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:28:06 crc kubenswrapper[4846]: E1122 10:28:06.052437 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:28:17 crc kubenswrapper[4846]: I1122 10:28:17.036001 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:28:17 crc kubenswrapper[4846]: E1122 10:28:17.036919 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:28:29 crc kubenswrapper[4846]: I1122 10:28:29.035416 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:28:29 crc kubenswrapper[4846]: E1122 10:28:29.036215 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:28:43 crc kubenswrapper[4846]: I1122 10:28:43.036198 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:28:43 crc kubenswrapper[4846]: E1122 10:28:43.037227 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:28:57 crc kubenswrapper[4846]: I1122 10:28:57.034999 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:28:57 crc kubenswrapper[4846]: E1122 10:28:57.035831 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:29:10 crc kubenswrapper[4846]: I1122 10:29:10.035435 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:29:10 crc kubenswrapper[4846]: E1122 10:29:10.037490 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:29:23 crc kubenswrapper[4846]: I1122 10:29:23.036164 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:29:23 crc kubenswrapper[4846]: E1122 10:29:23.037463 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:29:38 crc kubenswrapper[4846]: I1122 10:29:38.035162 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:29:38 crc kubenswrapper[4846]: E1122 10:29:38.035875 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:29:53 crc kubenswrapper[4846]: I1122 10:29:53.034776 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af"
Nov 22 10:29:53 crc kubenswrapper[4846]: E1122 10:29:53.035975 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.142690 4846 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"]
Nov 22 10:30:00 crc kubenswrapper[4846]: E1122 10:30:00.145187 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a64585-1bea-491c-bfd9-67aa22eef780" containerName="extract-utilities"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.145342 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a64585-1bea-491c-bfd9-67aa22eef780" containerName="extract-utilities"
Nov 22 10:30:00 crc kubenswrapper[4846]: E1122 10:30:00.145478 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a64585-1bea-491c-bfd9-67aa22eef780" containerName="registry-server"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.145589 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a64585-1bea-491c-bfd9-67aa22eef780" containerName="registry-server"
Nov 22 10:30:00 crc kubenswrapper[4846]: E1122 10:30:00.145760 4846 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a64585-1bea-491c-bfd9-67aa22eef780" containerName="extract-content"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.145874 4846 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a64585-1bea-491c-bfd9-67aa22eef780" containerName="extract-content"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.146295 4846 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a64585-1bea-491c-bfd9-67aa22eef780" containerName="registry-server"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.147441 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.149444 4846 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.150788 4846 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.159710 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"]
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.282911 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dft46\" (UniqueName: \"kubernetes.io/projected/7d67686d-778f-44f5-8d89-aa949e8bae6f-kube-api-access-dft46\") pod \"collect-profiles-29396790-ljwzq\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.283033 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d67686d-778f-44f5-8d89-aa949e8bae6f-secret-volume\") pod \"collect-profiles-29396790-ljwzq\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.283099 4846 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d67686d-778f-44f5-8d89-aa949e8bae6f-config-volume\") pod \"collect-profiles-29396790-ljwzq\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.384279 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dft46\" (UniqueName: \"kubernetes.io/projected/7d67686d-778f-44f5-8d89-aa949e8bae6f-kube-api-access-dft46\") pod \"collect-profiles-29396790-ljwzq\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.384416 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d67686d-778f-44f5-8d89-aa949e8bae6f-secret-volume\") pod \"collect-profiles-29396790-ljwzq\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.384469 4846 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d67686d-778f-44f5-8d89-aa949e8bae6f-config-volume\") pod \"collect-profiles-29396790-ljwzq\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.385406 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d67686d-778f-44f5-8d89-aa949e8bae6f-config-volume\") pod \"collect-profiles-29396790-ljwzq\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.390862 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d67686d-778f-44f5-8d89-aa949e8bae6f-secret-volume\") pod \"collect-profiles-29396790-ljwzq\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.402880 4846 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dft46\" (UniqueName: \"kubernetes.io/projected/7d67686d-778f-44f5-8d89-aa949e8bae6f-kube-api-access-dft46\") pod \"collect-profiles-29396790-ljwzq\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:00 crc kubenswrapper[4846]: I1122 10:30:00.471288 4846 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"
Nov 22 10:30:01 crc kubenswrapper[4846]: I1122 10:30:00.939601 4846 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq"]
Nov 22 10:30:01 crc kubenswrapper[4846]: I1122 10:30:01.722181 4846 generic.go:334] "Generic (PLEG): container finished" podID="7d67686d-778f-44f5-8d89-aa949e8bae6f" containerID="32a74f3c40e6ce127dcc14d23bbfea8b07161c3a2a616daf1ee293b0ff07a903" exitCode=0
Nov 22 10:30:01 crc kubenswrapper[4846]: I1122 10:30:01.722305 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq" event={"ID":"7d67686d-778f-44f5-8d89-aa949e8bae6f","Type":"ContainerDied","Data":"32a74f3c40e6ce127dcc14d23bbfea8b07161c3a2a616daf1ee293b0ff07a903"}
Nov 22 10:30:01 crc kubenswrapper[4846]: I1122 10:30:01.722478 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq" event={"ID":"7d67686d-778f-44f5-8d89-aa949e8bae6f","Type":"ContainerStarted","Data":"664fee63693c12136b3c5266021e60833f1602bd3c37d24b421d4b1993191aae"}
Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.163534 4846 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq" Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.347180 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d67686d-778f-44f5-8d89-aa949e8bae6f-secret-volume\") pod \"7d67686d-778f-44f5-8d89-aa949e8bae6f\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.347245 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d67686d-778f-44f5-8d89-aa949e8bae6f-config-volume\") pod \"7d67686d-778f-44f5-8d89-aa949e8bae6f\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.347326 4846 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dft46\" (UniqueName: \"kubernetes.io/projected/7d67686d-778f-44f5-8d89-aa949e8bae6f-kube-api-access-dft46\") pod \"7d67686d-778f-44f5-8d89-aa949e8bae6f\" (UID: \"7d67686d-778f-44f5-8d89-aa949e8bae6f\") " Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.348610 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d67686d-778f-44f5-8d89-aa949e8bae6f-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d67686d-778f-44f5-8d89-aa949e8bae6f" (UID: "7d67686d-778f-44f5-8d89-aa949e8bae6f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.354268 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d67686d-778f-44f5-8d89-aa949e8bae6f-kube-api-access-dft46" (OuterVolumeSpecName: "kube-api-access-dft46") pod "7d67686d-778f-44f5-8d89-aa949e8bae6f" (UID: "7d67686d-778f-44f5-8d89-aa949e8bae6f"). InnerVolumeSpecName "kube-api-access-dft46". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.355064 4846 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d67686d-778f-44f5-8d89-aa949e8bae6f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d67686d-778f-44f5-8d89-aa949e8bae6f" (UID: "7d67686d-778f-44f5-8d89-aa949e8bae6f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.450080 4846 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d67686d-778f-44f5-8d89-aa949e8bae6f-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.450114 4846 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d67686d-778f-44f5-8d89-aa949e8bae6f-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.450129 4846 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dft46\" (UniqueName: \"kubernetes.io/projected/7d67686d-778f-44f5-8d89-aa949e8bae6f-kube-api-access-dft46\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.743368 4846 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq" event={"ID":"7d67686d-778f-44f5-8d89-aa949e8bae6f","Type":"ContainerDied","Data":"664fee63693c12136b3c5266021e60833f1602bd3c37d24b421d4b1993191aae"} Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.743417 4846 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-ljwzq" Nov 22 10:30:03 crc kubenswrapper[4846]: I1122 10:30:03.743427 4846 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="664fee63693c12136b3c5266021e60833f1602bd3c37d24b421d4b1993191aae" Nov 22 10:30:04 crc kubenswrapper[4846]: I1122 10:30:04.262340 4846 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6"] Nov 22 10:30:04 crc kubenswrapper[4846]: I1122 10:30:04.271639 4846 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-6v8z6"] Nov 22 10:30:06 crc kubenswrapper[4846]: I1122 10:30:06.057993 4846 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9" path="/var/lib/kubelet/pods/43b6dcaf-4a3d-4bf1-a050-e62dc7f4b2a9/volumes" Nov 22 10:30:07 crc kubenswrapper[4846]: I1122 10:30:07.036514 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af" Nov 22 10:30:07 crc kubenswrapper[4846]: E1122 10:30:07.037334 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:30:21 crc kubenswrapper[4846]: I1122 10:30:21.036271 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af" Nov 22 10:30:21 crc kubenswrapper[4846]: E1122 10:30:21.037872 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:30:27 crc kubenswrapper[4846]: I1122 10:30:27.610915 4846 scope.go:117] "RemoveContainer" containerID="f1d9c6f83991ffffd979d00041bb0d3685f477fbc4d195adb605c3b86e04d95b" Nov 22 10:30:36 crc kubenswrapper[4846]: I1122 10:30:36.049681 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af" Nov 22 10:30:36 crc kubenswrapper[4846]: E1122 10:30:36.050649 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:30:50 crc kubenswrapper[4846]: I1122 10:30:50.036272 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af" Nov 22 10:30:50 crc kubenswrapper[4846]: E1122 10:30:50.037920 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:31:01 crc kubenswrapper[4846]: I1122 10:31:01.036282 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af" Nov 22 10:31:01 crc kubenswrapper[4846]: E1122 10:31:01.037455 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a" Nov 22 10:31:12 crc kubenswrapper[4846]: I1122 10:31:12.035696 4846 scope.go:117] "RemoveContainer" containerID="6ef316fdcd6522a36d06d6d641dfe5b3474ee0f6dcc1f3852ee078c88d68e9af" Nov 22 10:31:12 crc kubenswrapper[4846]: E1122 10:31:12.037126 4846 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c59mw_openshift-machine-config-operator(86a01cc5-5438-4978-8919-2d24f665922a)\"" pod="openshift-machine-config-operator/machine-config-daemon-c59mw" podUID="86a01cc5-5438-4978-8919-2d24f665922a"